Kurt Marko is an IT industry analyst, consultant and regular contributor to a number of technology publications, pursuing his passion for communications after a varied career that has spanned virtually the entire high-tech food chain from chips to systems. Upon graduating from Stanford University with bachelor’s and master’s degrees in electrical engineering, Marko spent several years as a semiconductor device physicist, doing process design, modeling and testing.
He then joined AT&T Bell Laboratories as a memory chip designer and CAD and simulation developer. Moving to Hewlett-Packard, he started in the laser printer R&D lab doing electrophotography development, for which he earned a patent, but his love of computers eventually led him to join HP’s nascent technical IT group. Marko spent 15 years as an IT engineer and was a lead architect for several enterprisewide infrastructure projects at HP, including the Windows domain infrastructure, remote access service, Exchange email infrastructure and managed web services.
WHAT HAPPENS WHEN YOU TAKE THE CONCEPT OF “CLOUD” TO ITS LOGICAL CONCLUSION?
As organizations and developers gain familiarity with cloud services, that question is gaining urgency. The decoupling of sophisticated application and network services — databases, machine learning, load balancing, development/deployment pipelines — from infrastructure yields more profound benefits than just outsourcing CPU cycles and storage capacity. Logical abstractions go only so far, however — for most cloud services, architects must still make choices around infrastructure components and resulting configurations: How big do we size the SQL database? Do we use a GPU or memory-enhanced instance for Hadoop? Which instances and network ports should we load balance? Plenty of decisions must be made before a single job is run.
But what if you could just execute custom applications in response to particular inputs without the need to size instances or even deploy anything but your code? That’s the promise of cloud serverless functions, popularized as “serverless computing” with the introduction of AWS Lambda in 2014. We prefer to call them serverless functions or functions-as-a-service (FaaS), because the implication of “serverless” — that code magically executes without any server at all — is flawed.
Still, the “serverless” moniker isn’t entirely inappropriate. It arises from the fact that the user need not worry about the server, its configuration, location or availability — just the code. Indeed, serverless functions are the closest thing yet to true utility computing. Just as the electric grid delivers whatever power is needed, even when running the A/C full blast on a scorching day, serverless functions will automatically and dynamically scale capacity in response to demand.
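To make “just the code” concrete, here is a minimal sketch of what a developer actually deploys, assuming the AWS Lambda Python handler convention (the function name `handler` and the sample `event` field are illustrative choices, not anything prescribed by the article):

```python
import json

def handler(event, context):
    """Entry point the platform invokes per event; no server is
    provisioned or managed by the author of this code.

    'event' carries the triggering input (e.g., an HTTP request body
    or a storage notification); 'context' holds runtime metadata.
    """
    # 'name' is a hypothetical input field used for illustration.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

The provider runs as many concurrent copies of this function as incoming events demand — the scaling behavior the utility-grid analogy describes — and the developer never selects an instance size.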
If this seems like an impossible trick, read on. We’ll explain the mechanics and discuss where serverless functions are best used. If FaaS sounds too good to be true, that’s because it comes with important conditions — although with a little creativity, the range of usage scenarios can be quite broad.