Exclusive interview with Dale Kim, Senior Director of Technical Solutions at Hazelcast

from Analytics Insight
May 24, 2021

In-memory computing refers to storing data in the main memory (RAM) of dedicated servers rather than in complex relational databases running on slow disk drives. With in-memory computing, business customers such as retailers, banks, and utilities can quickly identify patterns, analyze large amounts of data on the fly, and complete operations faster. The ongoing decline in memory prices is a major contributor to the growing success of in-memory computing, making it affordable for a wide variety of applications.

Speaking to Analytics Insight, Hazelcast senior director of technical solutions, Dale Kim, shares an insight into how Hazelcast’s in-memory computing platform is meeting the growing demand for improved application performance, speed, and scalability.

Please tell us about the company, its specialization, and the services it offers.

Hazelcast is a San Mateo, CA-based open source software company that provides a cloud-native application platform with in-memory computing and stream processing capabilities. The platform is used to add real-time capabilities to existing infrastructures, as well as to accelerate business applications to meet stringent SLAs and drive innovation through more experimentation. The commercial enterprise edition of our software is licensed on a per-node subscription basis and includes features that simplify production deployments, such as business continuity, reduced planned downtime, greater scalability, and security.

With what mission and goals was the company founded? In short, tell us about your journey since the company was founded.

The inception of the company was driven by the need for organizations to get more value from their data, faster. One way to achieve this was to put subsets of data in memory, letting applications avoid the bottlenecks of disk-based data access. While this architectural pattern has long been used in the form of caching, organizations now want larger data sets in memory, and sophisticated, distributed in-memory technology is a far better option than per-application or per-node caches. That was only part of the story, however, as organizations also looked for large-scale computing that could spread work across all of the nodes in a cluster. This was a huge advantage of Hazelcast over in-memory databases, which are mostly good for simple caching use cases. With Hazelcast, IT teams can easily create applications that are deployed on multiple nodes and work together in parallel. At the same time, network and disk access is reduced by reading in-memory data that resides on the same node as each application instance.
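The distributed in-memory pattern described above can be illustrated with a toy sketch. This is not Hazelcast's actual API (which is Java-based); it only shows the idea that keys are hashed to partitions, each partition is owned by one node, and data lives in that node's RAM so co-located application code can read it locally.

```python
# Toy illustration of a partitioned in-memory store (not Hazelcast's API):
# keys hash to a fixed number of partitions, each partition is assigned to
# one node, and values live in that node's RAM rather than on disk.

NUM_PARTITIONS = 271  # Hazelcast's default partition count is 271

class Node:
    def __init__(self, name):
        self.name = name
        self.store = {}  # in-memory key/value storage for owned partitions

class Cluster:
    def __init__(self, nodes):
        self.nodes = nodes

    def partition_for(self, key):
        return hash(key) % NUM_PARTITIONS

    def owner(self, key):
        # Round-robin partition ownership for simplicity; a real cluster
        # rebalances partitions dynamically as nodes join and leave.
        return self.nodes[self.partition_for(key) % len(self.nodes)]

    def put(self, key, value):
        self.owner(key).store[key] = value

    def get(self, key):
        return self.owner(key).store.get(key)

cluster = Cluster([Node("node-1"), Node("node-2"), Node("node-3")])
cluster.put("customer:42", {"name": "Alice"})
print(cluster.get("customer:42"))
```

Because ownership is deterministic, any client can compute which node holds a key without a central lookup, which is the property that lets application instances read data residing on their own node.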

The most recent innovation for Hazelcast was the extension of its compute framework to enable the processing of data streams. Hazelcast can read an incoming, continuous feed of high-speed data and apply a variety of operations, such as transformations, filtering, aggregations, and scoring, for extremely high-throughput, low-latency machine learning. The stream processing functions work with the in-memory framework to significantly reduce latency and enable more computational work in less time.
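As a rough sketch of those streaming operations, the stages can be chained with plain Python generators. This is purely conceptual; Hazelcast expresses the same filter/transform/aggregate stages through its own pipeline API and distributes them across cluster nodes.

```python
# Minimal sketch of a stream-processing pipeline using Python generators;
# each stage consumes the previous one as events arrive.

def transactions():
    # Stand-in for a continuous, high-speed feed of events.
    yield from [
        {"account": "a1", "amount": 120.0},
        {"account": "a2", "amount": -5.0},   # invalid, filtered out below
        {"account": "a1", "amount": 80.0},
    ]

def filter_valid(stream):
    # Filtering stage: drop malformed events.
    for event in stream:
        if event["amount"] > 0:
            yield event

def enrich(stream):
    # Transformation stage: derive a field used by later stages.
    for event in stream:
        yield {**event, "large": event["amount"] >= 100.0}

def aggregate_by_account(stream):
    # Aggregation stage: running totals per account.
    totals = {}
    for event in stream:
        totals[event["account"]] = totals.get(event["account"], 0.0) + event["amount"]
    return totals

totals = aggregate_by_account(enrich(filter_valid(transactions())))
print(totals)  # {'a1': 200.0}
```

In a real deployment these stages would run in parallel on every cluster node, with the in-memory data grid holding intermediate state, which is where the latency reduction comes from.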

Tell us how your company is contributing to the country’s IoT / AI / big data analytics / robotics / self-driving vehicle / cloud computing industry and how the company is benefiting customers.

The above industries all share two key characteristics: massive amounts of data and the need to process it quickly. Hazelcast addresses the challenges associated with these data properties through its inherent design principles. First, Hazelcast is designed to be lightweight and standalone, with no external dependencies to run, which greatly simplifies integration with existing IT infrastructures. This also makes it easy to deploy in any environment, including IoT deployments at the edge, away from a central data center, since most edge computing sites have limited physical space and therefore fewer hardware resources.

Second, Hazelcast includes many performance optimizations that take advantage of all available computing resources, making it ideal for large computing environments as well. Because Hazelcast is cloud-native and elastically scalable, a cluster can easily be expanded as data sets grow. This extreme performance, scalability, and efficiency were recently demonstrated in a benchmark in which Hazelcast processed 1 billion records per second in a data stream at millisecond-level latency on only 720 virtual CPUs in the cloud.

Third, Hazelcast places a high value on protecting data, from both a reliability and a security perspective. Many customers use Hazelcast for mission-critical deployments that run their operations, where downtime would result in significant losses. High availability and disaster recovery capabilities in Hazelcast ensure continuity even in the event of hardware or site-wide failures. With built-in security functions, Hazelcast can also support environments with sensitive data and prevent unauthorized access.

How do disruptive technologies such as IoT / big data analysis / AI / machine learning / cloud computing affect today’s innovation?

These disruptive technologies enable companies to define their business strategies more efficiently and intelligently, because they are about overcoming the limits of manual effort to achieve automation, real-time responsiveness, and economies of scale. Interestingly, these new levels of automation lead to higher investments in human resources, which has a continuous, compounding effect on business speed and efficiency. For example, the agility gained through the adoption of cloud computing lets companies focus more on innovative efforts that create business value, rather than dedicating those human resources to maintaining infrastructure.

How does your company help customers achieve relevant business results by adopting the company’s technological innovations?

Real-time responsiveness and greater opportunity for innovation are just two ways customers gain a competitive advantage. One Hazelcast customer aggregates data about its own customer interactions and, based on each customer's entire interaction history, can instantly identify a range of product offerings. By steering these recommendations with the latest interactions recorded in its system as event data, the company achieves the goal of offering the right product at the right time. This implementation on Hazelcast resulted in a significant increase in offer conversions, making the initiative profitable. Another customer uses Hazelcast to identify fraud in financial transactions; the more fraud they can prevent, the more they contribute to their bottom line. They theorized that by building multiple machine learning-based fraud algorithms and running them concurrently to create a composite fraud score, they could more accurately predict whether a transaction is fraudulent or whether an otherwise suspicious-looking transaction is actually legitimate. With the higher scoring accuracy made possible by the performance headroom Hazelcast provides, they save millions of dollars annually.
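The composite-score idea can be sketched as follows. The model functions here are hypothetical stand-ins invented for illustration; in the customer scenario each would be a trained machine-learning model, and Hazelcast would run them in parallel across the cluster rather than in a local thread pool.

```python
# Sketch of combining several fraud models into one composite score.
# velocity_model, geo_model, and amount_model are hypothetical heuristics
# standing in for real trained models.
from concurrent.futures import ThreadPoolExecutor

def velocity_model(txn):
    # Flags rapid-fire spending (assumed heuristic for illustration).
    return 0.9 if txn["amount_last_hour"] > 5000 else 0.1

def geo_model(txn):
    # Flags transactions far from the cardholder's home country.
    return 0.8 if txn["country"] != txn["home_country"] else 0.2

def amount_model(txn):
    # Flags unusually large single transactions.
    return 0.7 if txn["amount"] > 2000 else 0.1

MODELS = [velocity_model, geo_model, amount_model]

def composite_fraud_score(txn):
    # Run all models concurrently and average their scores.
    with ThreadPoolExecutor(max_workers=len(MODELS)) as pool:
        scores = list(pool.map(lambda m: m(txn), MODELS))
    return sum(scores) / len(scores)

txn = {"amount": 2500, "amount_last_hour": 6000,
       "country": "FR", "home_country": "US"}
print(round(composite_fraud_score(txn), 2))  # 0.8
```

Averaging is only one way to combine the scores; a weighted blend or a meta-model can serve the same purpose, and the point of the extra performance headroom is that running all models per transaction stays within latency budgets.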

How does your company’s extensive know-how help uncover patterns with powerful analytics and machine learning?

The biggest challenge with advanced analytics today, especially machine learning initiatives, is getting machine learning models into production. Limited resources, sub-optimal technology, and poor skill allocation add to this challenge. Organizations need to simplify the process to achieve a higher deployment success rate, and therefore more experimentation that can lead to greater business success. Hazelcast offers significant performance improvements over traditional machine learning deployments, which require much more infrastructure and create much more complexity. This headroom gives customers the freedom to try new things and learn with a quick feedback loop. The relative simplicity of deploying machine learning models in Hazelcast means that IT teams can add models, including models written in Python, to the data pipeline directly from their data scientists without significant manual effort. The Job Upgrade feature in Hazelcast further simplifies this by allowing teams to replace running machine learning models with newer versions without downtime or loss of data.
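A zero-downtime model swap of the kind the Job Upgrade feature enables can be sketched conceptually. This is not Hazelcast's API, which upgrades whole streaming jobs while preserving their state; the sketch only shows the core idea of replacing the model behind a stable scoring entry point while events keep flowing.

```python
# Conceptual sketch of swapping a model without stopping the pipeline:
# scoring goes through a holder whose model reference can be replaced
# atomically, so future events use the new version with no downtime.
import threading

class ModelHolder:
    def __init__(self, model):
        self._model = model
        self._lock = threading.Lock()

    def score(self, event):
        with self._lock:
            return self._model(event)

    def upgrade(self, new_model):
        # Replace the model in place; the pipeline never stops.
        with self._lock:
            self._model = new_model

def model_v1(event):  # hypothetical first model version
    return 0.5

def model_v2(event):  # hypothetical retrained replacement
    return 0.9

holder = ModelHolder(model_v1)
before = holder.score({"amount": 100})
holder.upgrade(model_v2)
after = holder.score({"amount": 100})
print(before, after)  # 0.5 0.9
```

The lock makes the swap safe while scoring threads are active; Hazelcast additionally carries over accumulated job state (for example, running aggregations), which a naive in-place swap like this does not address.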

Mention some of the awards, achievements, recognitions, and customer feedback that you find noteworthy and valuable to the company.

In its early days, Hazelcast was selected as a Gartner Cool Vendor for application and integration platforms for its innovation in in-memory technologies that let customers build applications requiring rapid access to data. More recently, Hazelcast was selected as a Red Herring Top 100 North America private technology company for its innovative technology. Hazelcast customers have also received awards for their Hazelcast deployments, most notably The Banker's Innovation in Digital Banking Awards 2020, where 6 of the 15 award winners partnered with Hazelcast for next-generation global payment infrastructures and other digital transformation projects.

What is the advantage of your company over other players in the industry?

A common problem with other technologies in the industry is that the complexity of their deployments inhibits innovation. When so many resources are devoted to infrastructure, the opportunity cost is a lost chance to gain competitive advantage. Hazelcast addresses this problem with a simplified architecture that integrates well with existing systems. In addition, Hazelcast benchmarks have demonstrated superior computing performance and scalability, helping solve a huge problem for businesses facing ever-increasing workloads.


