Exclusive Interview with Dale Kim, Senior Director of Technical Solutions, Hazelcast
by Analytics Insight
May 24, 2021
In-memory computing refers to storing data in the main random access memory (RAM) of dedicated servers rather than in relational databases running on slower disk drives. It enables business customers, such as retailers, banks, and utilities, to quickly identify patterns, analyze large volumes of data on the fly, and perform operations in real time. Falling memory prices are a major factor in the growing success of in-memory computing, which as a result has become cost-effective for a wide variety of applications.
Speaking to Analytics Insight, Dale Kim, Senior Director of Technical Solutions, Hazelcast, provides an overview of how Hazelcast’s in-memory computing platform meets the growing demand for improved application performance, speed and scalability.
Please tell us about the company, its specialization and the services offered by your company.
Hazelcast is an open source software company based in San Mateo, Calif., that provides a cloud-native application platform with in-memory computing and stream processing capabilities. The platform is used to add real-time capabilities to existing infrastructures, to accelerate business applications to meet strict SLAs, and to promote innovation through greater experimentation. The commercial enterprise edition of our software is licensed per node, per subscription, and includes features that simplify production deployments, such as business continuity, reduced planned downtime, elastic scaling, and security.
With what mission and objectives was the company created? In short, tell us about your journey since the company's founding.
The founding of the company was driven by the need for companies to get value from their data faster. One way to achieve this was to put subsets of data in RAM so that applications could avoid the bottlenecks of accessing data on disk. Although this architectural model has long been used in the form of caching, companies wanted larger data sets in memory, and a sophisticated distributed in-memory technology was a much better option than per-application or per-node caches. That was only part of the story, however, as companies were also looking for large-scale data processing that could spread work across all nodes in a cluster. This was a big advantage of Hazelcast over in-memory databases, which were mostly good for simple caching use cases. Hazelcast enables IT teams to easily create applications that can be deployed on multiple nodes and work together in parallel, while reducing network and disk access by keeping data in memory on the same node as each instance of the application.
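The caching model described above can be illustrated with a minimal cache-aside sketch. This is plain Python for illustration only, not Hazelcast's actual API; the in-memory dictionary stands in for the distributed map that Hazelcast spreads across cluster nodes, and the simulated slow store stands in for disk-backed storage.

```python
import time

def make_slow_store(data, delay=0.01):
    """Simulates a disk-backed store with per-read latency."""
    def read(key):
        time.sleep(delay)  # stands in for slow disk access
        return data[key]
    return read

class CacheAside:
    """Fronts a slow backing store with an in-memory map."""
    def __init__(self, backing_read):
        self._cache = {}           # stands in for a distributed in-memory map
        self._read = backing_read

    def get(self, key):
        if key not in self._cache:             # cache miss: go to slow storage
            self._cache[key] = self._read(key)
        return self._cache[key]                # cache hit: served from RAM

store = make_slow_store({"user:1": "Alice", "user:2": "Bob"})
cache = CacheAside(store)
print(cache.get("user:1"))  # first read populates the cache
print(cache.get("user:1"))  # second read is served from memory
```

In the distributed version of this pattern, each application instance reads from the map partition held on its own node, which is what removes the network hop as well as the disk access.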
The most recent innovation for Hazelcast was extending the platform to enable the processing of data streams. Hazelcast can read an incoming, continuous stream of data at high speed and apply a variety of operations such as transformations, filtering, aggregations, and machine learning scoring, with extremely high throughput and low latency. The stream processing capabilities work in conjunction with the in-memory framework to dramatically reduce latency and enable more compute work in less time.
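The stream-processing stages mentioned above (transform, filter, aggregate) can be sketched with a simple generator-based pipeline. This is an illustrative single-process example, not Hazelcast's pipeline API; Hazelcast's engine additionally distributes these stages across a cluster for throughput.

```python
# Each stage consumes the previous stage's output lazily, event by event,
# mirroring the transform -> filter -> aggregate flow of a streaming pipeline.

def transform(stream):
    """Parse raw tuples into structured events."""
    for sensor, reading in stream:
        yield {"sensor": sensor, "reading": float(reading)}

def keep_anomalies(stream, threshold=100.0):
    """Filter stage: pass only readings above the threshold."""
    for event in stream:
        if event["reading"] > threshold:
            yield event

def count_by_sensor(stream):
    """Aggregation stage: count anomalous events per sensor."""
    counts = {}
    for event in stream:
        counts[event["sensor"]] = counts.get(event["sensor"], 0) + 1
    return counts

raw = [("s1", "120.5"), ("s2", "80.0"), ("s1", "150.2"), ("s3", "101.0")]
result = count_by_sensor(keep_anomalies(transform(iter(raw))))
print(result)  # {'s1': 2, 's3': 1}
```

A real streaming engine runs these stages continuously over an unbounded source and emits aggregates over windows rather than once at the end.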
Tell us how your business is contributing to the nation’s IoT / AI / Big Data Analytics / Robotics / Autonomous Vehicles / Cloud Computing industry and how the business is benefiting customers.
The industries mentioned above all share two key characteristics: massive amounts of data and the need to process it quickly. Hazelcast addresses the challenges associated with these data characteristics through its core design principles. First, Hazelcast is designed to be lightweight and standalone, with no external dependencies required to run, which greatly simplifies its integration into existing IT infrastructures. It is also easy to deploy in any environment, including IoT edge deployments far from a central data center, since most edge computing deployments have limited physical space and therefore fewer hardware resources.
Second, Hazelcast includes numerous performance optimizations that take advantage of all available computing resources, which also makes it ideal for large-scale data processing environments. Since Hazelcast is cloud native and can scale elastically, a cluster can easily grow as data sets grow. Extreme performance, scalability and efficiency were recently showcased in a benchmark test where Hazelcast processed 1 billion data records per second on a data stream, with millisecond latency on just 720 vCPUs in the cloud.
Third, Hazelcast emphasizes data protection from both a reliability and a security perspective. Since many customers use Hazelcast for the business-critical deployments that run their operations, downtime would result in significant losses. Hazelcast's high availability and disaster recovery capabilities ensure continuity even in the event of hardware or site-wide failures. With built-in security capabilities, Hazelcast can also support environments with sensitive data and prevent unauthorized access.
How do disruptive technologies such as IoT / Big Data Analytics / AI / Machine Learning / Cloud Computing impact today’s innovation?
These disruptive technologies allow companies to become much more efficient and much smarter in how they define their business strategies. Indeed, these technologies all aim to overcome the limitations of manual effort and thereby achieve automation, real-time responsiveness, and economies of scale. Interestingly, new levels of automation result in greater returns on investments in human resources, enabling a compounding effect on the speed and efficiency of the business. For example, the agility gained through the adoption of cloud computing allows companies to focus those human resources on innovation efforts that create more business value, rather than allocating them to maintaining infrastructure.
How is your company helping customers achieve relevant business results through the adoption of the company’s technological innovations?
Real-time responsiveness and greater opportunity for innovation are just two themes that allow customers to gain a competitive advantage. One Hazelcast customer aggregates data around its interactions with its own customers and, based on each customer's full interaction history, can immediately identify a set of product offers. By applying these recommendations based on the most recent interactions captured in their system as "event data," they achieve the goal of delivering the right product at the right time. This implementation on Hazelcast led to a significant increase in offer conversions, making the initiative profitable. Another customer uses Hazelcast to identify fraud in financial transactions. Naturally, the more fraud they can prevent, the more they add to their bottom line. They hypothesized that if they created multiple machine learning-based fraud algorithms and ran them simultaneously to produce a composite fraud assessment score, they could more accurately predict whether a transaction is fraudulent, or whether an otherwise suspicious transaction is genuinely legitimate. With the higher scoring accuracy made possible by the performance margin Hazelcast gave them, they were able to save millions of dollars per year.
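The composite-scoring idea the fraud customer describes can be sketched as follows. The models, features, and averaging rule here are invented for illustration; the point is simply that several independent detectors score the same transaction and their outputs are combined into one assessment.

```python
# Two toy fraud detectors, each returning a risk score in [0, 1].

def rule_based(txn):
    """Hypothetical rule: large transactions are high risk."""
    return 0.9 if txn["amount"] > 10_000 else 0.1

def velocity_model(txn):
    """Hypothetical model: many recent transactions raise risk."""
    return min(1.0, txn["recent_txn_count"] / 20)

def composite_score(txn, models):
    """Run all detectors on one transaction and average their scores."""
    scores = [m(txn) for m in models]
    return sum(scores) / len(scores)  # simple average as the composite

txn = {"amount": 12_000, "recent_txn_count": 5}
print(composite_score(txn, [rule_based, velocity_model]))  # (0.9 + 0.25) / 2
```

In a streaming deployment, each incoming transaction would be fanned out to all models in parallel and the composite computed before the transaction is approved or flagged.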
How does your company’s rich expertise help discover patterns with powerful analytics and machine learning?
The biggest challenge today with advanced analytics, and especially with machine learning initiatives, is getting machine learning models into production. Limited resources, sub-optimal technologies, and misallocation of skills all contribute to this challenge. Companies need to simplify the process to achieve a higher rate of successful deployments, and therefore more opportunity for experimentation, which can lead to greater business success. Hazelcast offers significant performance improvements over traditional machine learning deployments, which require much more infrastructure and create much more complexity. This performance margin gives customers the freedom to try new things and learn through a quick feedback loop. The relative simplicity of deploying machine learning models in Hazelcast allows IT teams to plug models from their data scientists, including models written in Python, directly into the data pipeline without significant manual effort. The "Job Upgrade" feature in Hazelcast further simplifies the effort by allowing teams to replace existing machine learning models with newer versions without downtime or data loss.
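The zero-downtime model replacement described above can be sketched in miniature as an atomic swap of the live model reference while scoring traffic continues. This is a hypothetical single-process illustration of the idea, not Hazelcast's "Job Upgrade" implementation, and all names here are invented.

```python
import threading

class ModelServer:
    """Holds the live model; scoring continues across upgrades."""
    def __init__(self, model):
        self._model = model
        self._lock = threading.Lock()

    def score(self, features):
        with self._lock:              # readers always see a complete model
            return self._model(features)

    def upgrade(self, new_model):
        with self._lock:              # atomic swap: no request is dropped
            self._model = new_model

def fraud_model_v1(features):
    return 0.25 * sum(features)       # stand-in for a trained model

def fraud_model_v2(features):
    return 0.5 * sum(features)        # retrained version with new weights

server = ModelServer(fraud_model_v1)
print(server.score([1, 2]))           # scored by v1
server.upgrade(fraud_model_v2)        # live replacement, no downtime
print(server.score([1, 2]))           # scored by v2
```

A production streaming engine must additionally preserve in-flight state across the swap (for example via snapshots), which is the "no data loss" half of the guarantee.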
Mention some of the customer awards, accomplishments, recognitions and reviews that you think are remarkable and valuable to the business.
In its early days, Hazelcast was selected as a Gartner Cool Vendor for Application and Integration Platforms for its innovation in in-memory technologies that enables customers to build applications requiring rapid access to data. Most recently, Hazelcast was selected as one of Red Herring’s Top 100 Private Technology Companies in North America for its Global 2000 market adoption and innovative technology. Hazelcast customers also received awards for their Hazelcast deployments, including The Banker’s 2020 Digital Banking Innovation Awards, in which 6 of the 15 winners partnered with Hazelcast for next-generation global payments infrastructure and other digital transformation projects.
What is the advantage of your company compared to other players in the industry?
A common problem with other technologies in the industry is that the complexity of their deployments inhibits innovation. With so many resources spent on infrastructure, there is an opportunity cost in the pursuit of competitive advantage. Hazelcast solves this problem with a simplified architecture designed to integrate well with existing systems. Additionally, Hazelcast benchmarks have shown superior data processing performance and scalability, addressing a major concern of businesses facing ever-increasing workloads.