Unleashing IoT’s true potential – the power of in-memory analytics databases
We’ve all heard the industry chatter around IoT and how it can enable businesses to unlock and deliver commercial success. By extracting real-time data from connected endpoints, IoT helps businesses to make more informed and intelligent decisions that can directly impact their bottom line.
However, far too often, organisations get swept up in the promise of what IoT data could deliver without first asking what the data is actually being collected for. Data for data’s sake isn’t a valuable business resource, so it’s essential to consider where IoT’s true value lies.
Extracting real-time data and using it to empower an organisation’s workforce to make better, quicker decisions opens up endless possibilities. Against that backdrop, collecting the data is easily the smallest hurdle to overcome. It’s at the next step – funneling it into an analytics platform for analysis and insight gathering – that things get interesting and the data’s true value can start to be felt.
The faster the in-memory system, the better
Like sand inside an hourglass, when a large amount of real-time data is fed into a slow analytics platform – as is the case within many organisations – it inevitably bottlenecks, slowing down and devaluing the data. Given time, some useful business insights and patterns may be found via these slow analytics platforms, or maybe even through manual data science. However, information that was important yesterday may have lost its relevance. If organisations want to harness the real value offered by IoT they need to use an in-memory analytics database that is capable of processing the data at much faster speeds.
In-memory computing refers to using a computer’s random-access memory (RAM) rather than its hard disk drive or flash storage. Because RAM can be read and written at much faster speeds than a traditional hard disk drive, it is well suited to processing large volumes of IoT data in real-time. When in-memory technology is applied to analytics databases, multiple users can interact with the data simultaneously without it bottlenecking.
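To make the idea concrete, here is a minimal sketch of the difference between on-disk and in-memory storage. It uses Python’s built-in sqlite3 purely as a stand-in – production in-memory analytics databases are far more sophisticated – but the `:memory:` connection illustrates the principle of keeping the working set entirely in RAM; the table and data are invented for illustration.

```python
# Minimal illustration of in-memory storage: the analytics query is
# identical whether data sits on disk or in RAM -- only the storage
# layer changes. sqlite3's ":memory:" mode keeps everything in RAM.
import sqlite3

def load_readings(conn, readings):
    """Create a readings table and bulk-insert (sensor_id, value) rows."""
    conn.execute("CREATE TABLE readings (sensor_id INTEGER, value REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", readings)
    conn.commit()

# Simulated IoT telemetry: 10 sensors, 1,000 readings each.
readings = [(i % 10, float(i)) for i in range(10_000)]

# In-memory connection: data lives in RAM for the connection's lifetime.
mem = sqlite3.connect(":memory:")
load_readings(mem, readings)

# A typical analytics query -- average reading per sensor.
avg_per_sensor = dict(
    mem.execute(
        "SELECT sensor_id, AVG(value) FROM readings GROUP BY sensor_id"
    ).fetchall()
)
print(avg_per_sensor[0])
```

The same code with a file path instead of `:memory:` would persist the data to disk; the point is that the query layer stays the same while the in-memory variant avoids disk I/O entirely.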
A hybrid cloud infrastructure can take the speed, flexibility and scalability offered by in-memory systems to the next level. It can adequately prepare businesses for the big increases in data capacity from IoT adoption and can also duplicate data in real-time when managing migrations to new cloud-based applications, without causing any major disruptions or loss of service.
How to acquire the best in-memory system for your data
With so many in-memory options to choose from, finding the right system for a particular organisational need can be overwhelming. Before going any further, identify the business applications that stand to benefit from better data analytics performance, so the new technology can be judged on whether it delivers better service, better insights and optimised operations.
Next, establish clear criteria by setting out a complete set of benchmark requirements. At this point it is also a good idea to put together a solid cross-functional team who will be responsible for researching and evaluating candidate systems against those benchmarks.
Once these clear goals have been established organisations should consider the following five points when exploring in-memory analytic databases:
- General system architecture: It’s important to remember that no two analytics databases are the same, and the best analytics performance comes from in-memory processing that is tightly integrated into the database engine, as opposed to simply bolting a cache on top. With this integrated in-memory approach, large numbers of database users can run larger, more complex analytics workloads, while the database remains available for a wide variety of other tasks.
- Costs and scalability: There are some vital factors to take into account when preparing to implement an in-memory analytics database. Always consider the associated licensing and software acquisition costs, which come on top of any hardware investment needed. In addition, ask whether the database is a scalable massively parallel processing (MPP) system that allows additional servers to be added with ease as they are needed. If it is not an MPP system, make sure your budget reflects the extra costs involved.
- Integration: An important step in selecting an appropriate in-memory database is investigating whether the solution is mature enough to handle complex analytic workloads. For example, can the system support commonly-used drivers and interfaces? Does it integrate with the most widely-used Extract, Transform, Load (ETL) and business intelligence (BI) tools? It’s vital that the in-memory database remains compatible as the analytics ecosystem evolves over time.
- Vendor maturity and customer references: Finding the right in-memory solution is only the first step; organisations must also ensure that they can depend on their database vendor and its customer community to provide continued support. Speaking with existing customers is a great way of establishing how much future support the vendor offers. This also provides the opportunity to gain insight into the real-world advantages one particular solution offers over another within businesses that are already using it, as well as of course any negatives they have come across. If this is not a possibility, it’s worth asking the vendor if they can provide a customer case study.
- Simplicity: This one may seem obvious, but as a final step before committing, find out how easy your chosen solution is to install and operate; for example, does it require an army of database administrators to keep it running? Remember, the more automated the system, the fewer hoops an organisation will have to jump through to see solid business value from its BI and analytics projects.
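Once candidates are shortlisted against the points above, the benchmark requirements can be turned into a small, repeatable harness so every system is measured on the same workload. The sketch below is one possible shape for such a harness; sqlite3, the schema and the query are placeholders invented for illustration – in practice you would swap in each candidate’s driver and a query that mirrors your real IoT workload.

```python
# Repeatable micro-benchmark sketch: time a representative analytics
# query several times and report the median, so shortlisted databases
# can be compared on identical terms.
import sqlite3
import statistics
import time

def time_query(conn, sql, runs=5):
    """Run `sql` `runs` times and return the median wall-clock seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        conn.execute(sql).fetchall()
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

# Placeholder candidate: an in-memory SQLite store with simulated telemetry.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor_id INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(50_000)],
)

median_s = time_query(
    conn, "SELECT sensor_id, MAX(value) FROM readings GROUP BY sensor_id"
)
print(f"median query time: {median_s:.4f}s")
```

Taking the median rather than a single run smooths out cache warm-up and background noise, which matters when the differences between candidates are small.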
Knowing what you want from your in-memory database is key to success
There’s no doubt that using IoT can provide real organisational value. The whole point of it is to be able to make quick, informed decisions based on real-time insights. Businesses today can’t afford to be waiting on queries if they want to realise the full value and potential of IoT and stay ahead of competitors.
In-memory databases are the solution. They allow businesses to simplify their existing IT infrastructure and process larger IoT data workloads with far fewer hardware resources. By moving traditionally troublesome processes and applications onto an in-memory system, organisations can avoid costly upgrades to legacy systems, traditional databases and hardware appliances. Instead, they can focus on running complicated analyses in near real-time to find actionable value in IoT data, which will impact everything from profitability to productivity across the business.
This article was written by Mathias Golombek, the CTO of Exasol. Mathias joined Exasol back in 2004 as a software developer, led the database optimisation team and became a member of the executive board in 2013. Although he is primarily responsible for the Exasol technology, his most important role is to build a great environment where smart people enjoy building such an exciting product. He is never satisfied with 90% solutions and loves the simplicity of products. His goal is to encourage responsibility and a company culture that people love to be a part of.