Introduction: With 2021 fast approaching, will Serverless eventually replace microservices? And what does the path from microservices to Serverless look like? This article compares Serverless and microservices in depth, weighing their respective advantages and disadvantages.
"Can Serverless replace microservices?" is one of the hottest questions in the Serverless category on Zhihu.
Some argue that microservices and Serverless pull in opposite directions: although you can build microservices on a Serverless backend, there is no direct path between the two. Others argue that because a Serverless function can be seen as a smaller, atomic service, it naturally matches several core ideas of microservices, so Serverless and microservices are a match made in heaven. With 2021 fast approaching, will Serverless eventually replace microservices, and what does the path from microservices to Serverless look like? This article compares the two in depth, weighing their advantages and disadvantages.
Conceptually, microservices fit the Serverless function model well: they make it easy to deploy different services and isolate them at runtime. On the storage side, services like DynamoDB allow each microservice to own its database and scale it independently. Before diving into the details, though, don't rush to "go small." Start from your team's actual situation and honestly consider whether microservices are a good fit; never choose them just because "it's the trend."
Advantages of microservices in a Serverless environment
Selective scalability and concurrency
Serverless makes concurrency and scalability easy to manage, and a microservices architecture takes full advantage of that: each microservice can have its own concurrency and scalability settings tuned to its requirements. This is valuable from several angles: it reduces the impact of a DDoS attack, limits the financial risk of runaway cloud bills, allows better resource allocation, and so on.
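One way to manage per-service settings is to keep them in a single table and apply them from a deployment script. The sketch below assumes hypothetical service names and illustrative numbers; the commented boto3 call shows how such a budget could be applied on AWS Lambda.

```python
# Per-service reserved-concurrency budget. Service names and numbers
# are hypothetical, chosen only to illustrate the idea.
CONCURRENCY_LIMITS = {
    "public-api": 100,    # cap exposure to traffic spikes and runaway bills
    "image-resizer": 20,  # bursty but non-critical background work
}

def reserved_concurrency(service: str, default: int = 10) -> int:
    """Concurrency cap for a given microservice (falls back to a safe default)."""
    return CONCURRENCY_LIMITS.get(service, default)

# On AWS this could be applied with boto3 (not executed here):
# import boto3
# boto3.client("lambda").put_function_concurrency(
#     FunctionName="public-api",
#     ReservedConcurrentExecutions=reserved_concurrency("public-api"),
# )
```

Keeping these limits in one place makes the per-service budgets reviewable, instead of scattering them across console settings.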
Fine-grained resource allocation
Because scalability and concurrency are selective, resource allocation can be controlled at a fine granularity. With Lambda functions, each microservice can have a different memory allocation tailored to its needs. For example, a customer-facing service can be given a higher memory allocation, since that helps shorten execution time, while a latency-insensitive internal service can be deployed with cost-optimized memory settings.
The same applies to storage: databases such as DynamoDB or Aurora Serverless can be provisioned with different capacity levels according to the needs of each specific (micro)service.
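For the storage side, the same pattern works for per-table capacity. This sketch assumes hypothetical table names and illustrative read/write units; the commented boto3 call shows how it could be applied to DynamoDB provisioned tables.

```python
# Per-(micro)service DynamoDB capacity. Table names and unit counts
# are hypothetical, for illustration only.
TABLE_CAPACITY = {
    "orders":  {"ReadCapacityUnits": 50, "WriteCapacityUnits": 25},
    "reports": {"ReadCapacityUnits": 5,  "WriteCapacityUnits": 5},
}

def capacity_for(table: str) -> dict:
    """Provisioned throughput for the table owned by one microservice."""
    return TABLE_CAPACITY[table]

# With boto3 this could be applied as (not executed here):
# import boto3
# boto3.client("dynamodb").update_table(
#     TableName="orders", ProvisionedThroughput=capacity_for("orders"))
```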
Strictly speaking, this is a general property of microservices rather than something unique to Serverless, but Serverless makes it easier to decouple components with different requirements.
Support for multiple runtimes
The ease of configuring, deploying, and executing Serverless functions makes it practical to build systems on multiple runtimes.
With a Serverless infrastructure, you can choose Node.js for a regular backend API and Python for data-intensive work without spending extra operational effort. Obviously, this may bring your team additional work in code maintenance and team management.
Development team independence
Different developers or teams can work on their own microservices, fix bugs, and extend features without interfering with each other. Tools such as AWS SAM and the Serverless Framework give developers more independence at the operational level, and the emergence of AWS CDK lets development teams be even more independent without compromising quality or operational standards.
Weaknesses of microservices in a Serverless environment
Difficult to monitor and debug
Among the many challenges Serverless brings, monitoring and debugging may be the hardest. Compute and storage are scattered across many different functions and databases, not to mention queues, caches, and other services; these problems stem from microservices themselves. There are now specialized platforms that address all of this, but whether a professional team should adopt such platforms is also a cost decision.
Potentially more cold starts
A cold start happens when a FaaS platform (such as Lambda) has to start a new microVM to run your function code. If your function's workload is latency-sensitive, this can be a problem, because a cold start adds anywhere from a few hundred milliseconds to a few seconds to the total startup time. After a request completes, the FaaS platform usually keeps the microVM idle for a while, waiting for the next request, and then shuts it down after roughly 10-60 minutes (yes, it varies a lot). The result: the more frequently your function is executed, the more likely a microVM is already up and running for incoming requests (avoiding cold starts).
When we spread an application across hundreds or thousands of microservices, we may also spread the invocation volume across them, reducing the call frequency of each function. Note the wording "may spread calls": depending on your business logic and how your system behaves, the negative impact may be small or even negligible.
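The warm/cold dynamics described above can be sketched with a toy model. The timeout and latency numbers below are illustrative assumptions, not Lambda's actual values:

```python
IDLE_TIMEOUT_S = 600   # how long a microVM stays warm while idle (illustrative)
COLD_START_MS = 400    # extra latency when a new microVM must boot (illustrative)
WARM_MS = 5            # latency when a warm microVM handles the request

class FunctionInstance:
    """Toy model: a microVM stays warm until it sits idle past IDLE_TIMEOUT_S."""
    def __init__(self):
        self.last_call = None

    def invoke(self, now_s: float) -> int:
        cold = self.last_call is None or now_s - self.last_call > IDLE_TIMEOUT_S
        self.last_call = now_s
        return COLD_START_MS if cold else WARM_MS

# One busy function stays warm; the same traffic spread across many
# rarely-called functions makes each of them start cold more often.
busy = FunctionInstance()
busy_latencies = [busy.invoke(t) for t in (0, 30, 60)]      # frequent calls
rare = FunctionInstance()
rare_latencies = [rare.invoke(t) for t in (0, 1000, 2000)]  # infrequent calls
```

In this model the frequently-called function pays the cold-start penalty once, while the rarely-called one pays it on every request, which is exactly the risk of splitting traffic too finely.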
The concept of microservices has other inherent drawbacks that have no particular connection to Serverless. Even so, every team adopting this style of architecture should tread carefully to reduce the potential risks and costs:
- Defining service boundaries is not easy and can cause architectural problems.
- A larger attack surface.
- The overhead of service orchestration.
- Keeping compute and storage in sync (when needed) while achieving high performance and scalability is not easy.
Challenges and best practices for microservices in a Serverless environment
How big should a microservice be in Serverless?
When learning Serverless, it is easy to confuse "Function as a Service (FaaS)" with function declarations in programming languages. There is currently no way to draw a perfect line, but experience shows that very small Serverless functions are not a good idea.
When you decide to split a (micro)service into independent functions, you will have to face the problems of Serverless head-on. So, as a reminder: whenever possible, it is much better to keep related logic in a single function.
Of course, the decision should also weigh the advantages microservices bring. You can ask yourself: "If I split this microservice up..."
- Would it allow different teams to work independently?
- Would it benefit from fine-grained resource allocation or selective scalability?
If not, this service should stay bundled together with services that need similar resources, share its context, and perform related workloads.
Loosely coupled architecture
There are many ways to coordinate microservices built from Serverless functions.
When synchronous communication is needed, functions can be invoked directly (i.e., AWS Lambda's RequestResponse invocation type), but this leads to a tightly coupled architecture. A better choice is to use Lambda Layers or an HTTP API, so that clients are not affected when the service is later modified or migrated.
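The coupling difference can be sketched as an interface boundary. All class, service, and route names here are hypothetical, and the AWS calls are only indicated in comments:

```python
from typing import Protocol

class BillingClient(Protocol):
    """What callers depend on -- not how the billing service is deployed."""
    def charge(self, order_id: int) -> str: ...

class DirectLambdaClient:
    """Tight coupling: the caller must know the exact function name and would
    call it via boto3's invoke(InvocationType='RequestResponse')."""
    def charge(self, order_id: int) -> str:
        raise NotImplementedError("would invoke FunctionName='billing-service'")

class HttpApiClient:
    """Looser coupling: the caller only knows a stable URL; the service behind
    it can be modified or migrated without touching callers."""
    def __init__(self, base_url: str):
        self.base_url = base_url

    def charge(self, order_id: int) -> str:
        raise NotImplementedError(f"would POST to {self.base_url}/charges")

def place_order(billing: BillingClient, order_id: int) -> str:
    # Caller code is identical either way -- that is the point of the boundary.
    return billing.charge(order_id)
```

Because `place_order` depends only on the `BillingClient` protocol, swapping the transport (or stubbing it in tests) requires no changes to the caller.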
For asynchronous communication models, there are several options, such as queues (SQS), topic notifications (SNS), EventBridge, or DynamoDB Streams.
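What these asynchronous options have in common is that the publisher does not know its consumers. A minimal in-memory stand-in for SNS-style fan-out (handler names and the event shape are hypothetical):

```python
from typing import Callable, List

class Topic:
    """In-memory stand-in for SNS-style fan-out: publish once,
    and every subscribed microservice handler receives the event."""
    def __init__(self) -> None:
        self.subscribers: List[Callable[[dict], None]] = []

    def subscribe(self, handler: Callable[[dict], None]) -> None:
        self.subscribers.append(handler)

    def publish(self, event: dict) -> None:
        for handler in self.subscribers:
            handler(event)  # the publisher knows nothing about who consumes

# Two hypothetical downstream services react to the same event independently:
received = []
orders = Topic()
orders.subscribe(lambda e: received.append(("billing", e["id"])))
orders.subscribe(lambda e: received.append(("shipping", e["id"])))
orders.publish({"id": 42})
```

Adding a third consumer later requires no change to the publisher, which is the decoupling property the managed services above provide at scale.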
Cross-component isolation
Ideally, a microservice should not expose implementation details to its consumers. A Serverless platform like Lambda provides an API for invoking functions in isolation, but using it directly is itself a leak of implementation details. Ideally, we put an implementation-agnostic HTTP API layer in front of the functions to make them truly isolated.
The importance of concurrency limits and throttling policies
To mitigate DDoS attacks, when using services such as AWS API Gateway, be sure to set a separate concurrency limit and throttling policy for each public-facing endpoint. Such services generally have a global, region-wide concurrency quota on the cloud platform. Without per-endpoint limits, an attacker only needs to target a single endpoint to exhaust your quota and paralyze your entire system in that region.
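Per-endpoint throttling can be sketched with a token bucket. The routes, rates, and capacities below are hypothetical, chosen only to show one bucket per public endpoint:

```python
import time

class TokenBucket:
    """Per-endpoint throttle: refills `rate` tokens/second, bursts up to `capacity`."""
    def __init__(self, rate: float, capacity: float, clock=time.monotonic):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.clock = clock
        self.last = clock()

    def allow(self) -> bool:
        now = self.clock()
        # Refill proportionally to elapsed time, never exceeding capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A separate bucket per public endpoint: abusing /search cannot exhaust
# the budget of /checkout, and neither alone can drain a region-wide quota.
buckets = {
    "/checkout": TokenBucket(rate=50, capacity=100),
    "/search": TokenBucket(rate=200, capacity=400),
}

def handle(path: str) -> str:
    return "200 OK" if buckets[path].allow() else "429 Too Many Requests"
```

Managed gateways implement this kind of policy for you; the point of the sketch is that the limit must be scoped per endpoint, not shared globally.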
This article is original content from Alibaba Cloud and may not be reproduced without permission.