Serverless Functions: Start Now, Choose Projects Wisely

About the author

Kurt Marko is an IT industry analyst, consultant and regular contributor to a number of technology publications, pursuing his passion for communications after a varied career that has spanned virtually the entire high-tech food chain from chips to systems. Upon graduating from Stanford University with bachelor’s and master’s degrees in electrical engineering, Marko spent several years as a semiconductor device physicist, doing process design, modeling and testing.

He then joined AT&T Bell Laboratories as a memory chip designer and CAD and simulation developer. Moving to Hewlett-Packard, he started in the laser printer R&D lab doing electrophotography development, for which he earned a patent, but his love of computers eventually led him to join HP’s nascent technical IT group. Marko spent 15 years as an IT engineer and was a lead architect for several enterprise-wide infrastructure projects at HP, including the Windows domain infrastructure, remote access service, Exchange email infrastructure and managed web services.

WHAT HAPPENS WHEN YOU TAKE THE CONCEPT OF “CLOUD” TO ITS LOGICAL CONCLUSION?

As organizations and developers gain familiarity with cloud services, that question is gaining urgency. The decoupling of sophisticated application and network services — databases, machine learning, load balancing, development/deployment pipelines — from infrastructure yields more profound benefits than just outsourcing CPU cycles and storage capacity. Logical abstractions go only so far, however — for most cloud services, architects must still make choices around infrastructure components and resulting configurations: How big do we size the SQL database? Do we use a GPU or memory-enhanced instance for Hadoop? Which instances and network ports do we load balance? Plenty of decisions must be made before a single job is run.

But what if you could just execute custom applications in response to particular inputs without the need to size instances or even deploy anything but your code? That’s the promise of cloud serverless functions, popularized as “serverless computing” with the introduction of AWS Lambda in 2014. We prefer to call them serverless functions or functions-as-a-service (FaaS), because the implication that they magically execute code without a server is flawed.
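To make the idea concrete, here is a minimal sketch of what such a function might look like on AWS Lambda, written in Python. The handler name, the dictionary-shaped event and the HTTP-style response are assumptions for this example; the platform simply invokes the handler with whatever payload the configured trigger produces, and the developer never sizes or provisions a server.

```python
import json

def handler(event, context):
    """Entry point the FaaS runtime calls once per triggering event."""
    # "event" carries the trigger payload (an HTTP request, a queue message,
    # an object-storage notification, etc.); "context" exposes runtime
    # metadata such as the function name and remaining execution time.
    keys = sorted(event.keys()) if isinstance(event, dict) else []

    # Return an HTTP-style response for an API-gateway trigger; with other
    # trigger types the return value can be any JSON-serializable object.
    return {
        "statusCode": 200,
        "body": json.dumps({"received_keys": keys}),
    }
```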

Still, the “serverless” moniker isn’t entirely inappropriate. It arises from the fact that the user need not worry about the server, its configuration, location or availability — just the code. Indeed, serverless functions are the closest thing yet to true utility computing. Just as the electric grid delivers whatever power is needed, even when running the A/C full blast on a scorching day, serverless functions will automatically and dynamically scale capacity in response to demand.

If this seems like an impossible trick, read on. We’ll explain the mechanics and discuss where serverless functions are best used. If FaaS sounds too good to be true, that’s because it comes with important conditions, although with a little creativity the range of usage scenarios can be quite broad.
