In-Memory Meets Big Data: How Much Speed Do You Need?


Available On Demand
Duration: 60 min
Speakers
Doug Henschen
Executive Editor
InformationWeek
Doug Henschen is Executive Editor of InformationWeek, where he covers the intersection of enterprise applications with information management, business intelligence, big data, and analytics. He previously served as editor in chief of Intelligent Enterprise, editor in chief of Transform Magazine, and executive editor at DM News. He has covered IT and data-driven marketing for more than 15 years.
John Schitka
Solution Marketing Manager
SAP
John Schitka is a Solution Marketing Manager on the SAP Big Data Solution Marketing team, focusing largely on Hadoop and SAP HANA smart data access capabilities. A graduate of McMaster University, he holds an MBA from the University of Windsor. He has worked in product marketing and product management in the high-tech arena for a number of years, taught at a private college, and has co-authored several published textbooks. He has a true love of technology and all that it has to offer the world.
Wendy Schuchart
Editor
InformationWeek
Wendy Schuchart is a technology journalist with more than a decade of experience in enterprise IT. Most recently, Schuchart was the senior site editor of TechTarget's CIO Media Group. She has also served as section editor for UBM's Network Computing and Secure Enterprise. Connect with her on Twitter @wendyschuchart.

Databases and big-data platforms that can take advantage of large amounts of memory deliver the fastest data-analysis speeds available. That's enticing to companies struggling with high-scale online transactions or in need of timely analysis of customer behavior. For now, disk-based storage remains the enterprise standard, but the price of RAM is declining steadily. The ongoing question for practitioners will be, "How much speed do you need?"

Attend this webinar to get the following questions answered:

  • What qualifies as big data, and what are the options for in-memory data analysis?
  • How are memory standards changing in the data warehousing arena?
  • How are NoSQL and NewSQL vendors exploiting memory?
  • What qualifies as an in-memory database, and how are they being used?
  • How will in-memory technology play in the world of Hadoop?

In the big data arena, high-scale NoSQL and NewSQL database vendors are increasingly exploiting RAM, and at the extremes, in-memory databases are tackling real-time offers and gaming. In the world of Hadoop, the Spark framework has emerged for memory-intensive machine learning, modeling work, and even streaming data analysis. Register for this webinar and learn when these in-memory options may make sense for you.
