Presentation: How In-Memory Computing Enables a New Generation of Microservices

Track: SPONSORED SOLUTIONS TRACK I

Location: Marina

Duration: 1:40pm - 2:30pm

Day of week: Monday

Abstract

The first generation of microservices was envisioned as stateless request-response endpoints. But it's now clear that microservices must often maintain state. For example, microservices that run machine learning models or perform statistical classification must keep their models and parameter weights available across requests. This raises one of the biggest challenges: where is that state stored? Traditional options such as RDBMSs are too slow, do not scale, and have inflexible schema models. Distributed in-memory caching, however, is a widely adopted enterprise technology that offers high speed, scalability, and dynamic schema evolution.

In this talk, I will discuss:

  • Why today’s business solutions need a next-generation microservices architecture.
  • Why microservices need to leverage in-memory computing technologies.
  • How you can get started with next-generation microservices.
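The pattern the abstract describes can be sketched in a few lines. This is a minimal illustration, not Hazelcast's API: a plain dict stands in for a distributed in-memory map (in a real deployment, something like a replicated Hazelcast IMap), and the model name, weight values, and helper functions are all hypothetical.

```python
from typing import Dict, List

# Stand-in for a distributed in-memory map keyed by model name.
# In production this would be a replicated cache shared by all
# instances of the microservice, not a local dict.
cache: Dict[str, List[float]] = {}

def save_weights(model: str, weights: List[float]) -> None:
    """Publish updated weights so any service instance can read them."""
    cache[model] = weights

def predict(model: str, features: List[float]) -> float:
    """Score a request with the weights currently in the cache."""
    weights = cache[model]
    return sum(w * x for w, x in zip(weights, features))

# One instance updates the model; any instance can then serve requests.
save_weights("spam-filter", [0.5, -0.25, 1.0])
score = predict("spam-filter", [1.0, 2.0, 3.0])
```

Because the weights live in the shared cache rather than in any one process, instances can be added, removed, or restarted without retraining or losing model state.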

Speaker: Lucas Beeler

Senior Solutions Architect @hazelcast
