A new report from RTInsights describes the benefits of real-time transaction processing in banking and financial services, and shows how traditional database architectures interfere with real-time data movement. To realize those benefits – improved portfolio management, fast credit card fraud and acceptance checks, and more – banks and other financial services institutions need a translytical database, which combines the best of transactional and analytical data processing capabilities in a single, fast, scalable system.
A Real-Time Database for Banking
What is a real-time database? And why would banking and financial services companies need one?
A real-time database is a database that can support real-time processing. According to Wikipedia (as of the publication date), “Real-time processing means that a transaction is processed fast enough for the result to come back and be acted on right away.” That certainly sounds like something banks could use – when you go to an ATM, use a credit card, or apply for a home loan, you want the systems you’re using to return the right answers, right away. (These functions are also good examples of the use of machine learning in financial services, another SingleStore specialty.)
Indeed, accounting and banking are two of the areas where real-time databases are said to be most useful. The RTInsights report cites many important applications for “faster and more intelligent decision-making”: fraud monitoring; dynamic portfolio analysis; regulatory compliance; and protection from cyberthreats.
What Kind of Database Can Be Real-Time?
Traditional data processing depends on databases that seem designed not to enable real-time data movement, but to prevent it. These databases are not scalable, so they’re limited to the capabilities of a single machine. To make the most of what one machine can do, transactions are handled by a specific database type, called online transaction processing (OLTP).
Then, the OLTP system is tied up for a while so a specialized process, extract, transform, and load (ETL), can copy data off it. The data is then remixed with other in-house and outside data, reformatted for faster analytics performance, and moved to a different kind of database for analytics, broadly called online analytics processing (OLAP).
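The OLTP-to-OLAP hop described above can be sketched as a minimal batch ETL job. This is an illustrative toy, not any specific product’s pipeline: two in-memory SQLite databases stand in for the transactional store and the analytics warehouse, and all table and column names are made up for the example.

```python
import sqlite3

# "OLTP" side: row-oriented store receiving individual transactions.
oltp = sqlite3.connect(":memory:")
oltp.execute("CREATE TABLE txns (id INTEGER, account TEXT, amount REAL)")
oltp.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [(1, "acct-a", 120.0), (2, "acct-b", -40.0), (3, "acct-a", 15.5)],
)

# "OLAP" side: a separate database holding reshaped, aggregated data.
olap = sqlite3.connect(":memory:")
olap.execute("CREATE TABLE account_totals (account TEXT, total REAL)")

# Extract: read the transactional rows (in production, a periodic
# batch job that ties up the OLTP system while it runs).
rows = oltp.execute("SELECT account, amount FROM txns").fetchall()

# Transform: aggregate per account into an analytics-friendly shape.
totals = {}
for account, amount in rows:
    totals[account] = totals.get(account, 0.0) + amount

# Load: write the transformed rows into the analytics store.
olap.executemany("INSERT INTO account_totals VALUES (?, ?)", totals.items())

print(olap.execute(
    "SELECT account, total FROM account_totals ORDER BY account"
).fetchall())
```

The key point is the copy step: the OLAP side is only as fresh as the last ETL run, which is exactly the latency the article is describing.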
Different kinds of analytics databases exist; data warehouses have specialized tools for slicing and dicing data, while operational analytics databases are better suited to supporting applications, such as a mapping or ride-hailing app on your phone. Even NoSQL gets into the act, with data lakes used for data science queries and even for business intelligence (BI) tools, though the fit there is not very strong.
The movement of data from ingest, to OLTP, through ETL, to OLAP can take many hours and even days – far from real-time. So the RTInsights report puts forward translytical databases, which combine transactional and analytical capabilities, as the right place to look for an answer.
A translytical database is both fast and scalable – when a workload is too much for a single server to support, a second server can be added, extending the processing power, RAM, and disk space available for the stored data. By combining both functions into a single database, eliminating the intermediate steps inherent in the OLTP/ETL/OLAP split, the translytical database can serve as a real-time database, supporting crucial applications in banking and financial services.
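The pattern above can be sketched in a few lines. SQLite stands in here purely to show the single-store idea – one database serving both transactional writes and analytical reads with no intermediate copy step – and is not itself scalable or translytical; a real translytical system adds distributed scale-out across servers.

```python
import sqlite3

# One database, not two: the same table serves both workload types.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE txns (id INTEGER, account TEXT, amount REAL)")

# Transactional write: a new card payment lands...
db.execute("INSERT INTO txns VALUES (?, ?, ?)", (1, "acct-a", 120.0))

# ...and an analytical query over the same table sees it immediately,
# with no ETL hop and no second copy of the data to keep fresh.
total = db.execute(
    "SELECT SUM(amount) FROM txns WHERE account = 'acct-a'"
).fetchone()[0]
print(total)
```

Because the analytical query reads the same store the transaction wrote to, the answer reflects data that is seconds old, not hours old.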
Additional Benefits of a Real-Time Database for Financial Services
Organizations are so used to traditional, siloed data structures that they don’t see some of the hidden costs involved – costs that are removed when slow-moving data becomes real-time data. Here are some of the benefits that banks and other financial services organizations receive when they move to real-time transaction processing: