Cyber Security Roundup: How SingleStore Customers Power the World’s Cyber Security

SingleStore is proud to count many of the world's most innovative cyber security companies among our customers. Here's a roundup of some of these industry leaders' latest news and developments. We've also included highlights from our recent webinar with cyber solutions providers Twingo and Armis Security.

The war in Ukraine brings heightened cyber risk

Akamai recently released its findings on how geopolitical tensions increase the risk of distributed denial of service (DDoS) attacks and other damaging intrusions. It noted that, days after the Russian invasion, "Ukraine has been bombarded with DDoS assaults aimed at taking down government sites, communication providers, and financial institutions."

Akamai observed a specific surge in European DDoS activity as tensions escalated, as reflected in its data since fall 2021. Total attacks in Europe, the Middle East and Africa (EMEA) are up 220% over the average of the previous four years. The cybersecurity provider even had to execute emergency DDoS protection for a new customer in Eastern Europe that was impacted by increased attacker activity across the region.

This isn't the first time Akamai has observed heavy DDoS activity in the region. The blog notes "[t]his type of activity is reminiscent of previous DDoS attacks associated with prior Russian disputes and military activities (Estonia 2007, Georgia 2008, Crimea 2014). It is notable that this is not the first time Ukrainian websites have been targeted with alleged Russian DDoS attacks."

Palo Alto Networks reminded its customers of best practices to protect against Russia-Ukraine cyber activity. In particular, the company noted "[i]t's very common that newsworthy events are leveraged by threat actors as topics and lures in phishing and spear-phishing attacks. Leading up to the military action commencing in Ukraine, Unit 42 [Palo Alto Networks' threat intelligence group] saw spear-phishing attacks against Ukraine organizations to deliver malware."

For example, on February 1, 2022, Unit 42 observed an attack targeting an energy organization in Ukraine. CERT-UA publicly attributed the attack to a threat group it tracks as UAC-0056. The targeted attack involved a spear-phishing email sent to an employee of the organization, which used a social engineering theme suggesting the individual had committed a crime. The email had a Word document attached that contained a malicious JavaScript file that would download and install a payload known as SaintBot (a downloader) and OutSteel, a simple document stealer that searches for potentially sensitive documents based on their file type and uploads the files to a remote server.

You can check out Palo Alto Networks' blog to learn more about cyber best practices that are particularly relevant today.

How Armis Security prevented points of vulnerability in smart devices

Armis Security zeroed in on vulnerabilities related to smart manufacturing and digital operational technology (OT) transformation, calling for customers to update their security practices. Armis says that digital OT and smart equipment are common targets for cyber attacks by organized criminals and state-sponsored attackers, who are well aware that smart devices can often act as points of vulnerability to compromise. For example, in 2020 Armis uncovered 11 zero-day vulnerabilities in the VxWorks real-time operating system that left devices open to remote code execution, data leaks, denial of service and firewall bypass for access to the wider network.
VxWorks steward Wind River released an October 2020 update to patch these vulnerabilities, but as of December 2020, 97% of the affected devices remained unpatched. Clearly, geopolitical unrest heightens cybersecurity risk for manufacturers of all kinds. You can read more about the Armis deviceless and passive cyber security monitoring platform, and best practices for digital OT and the industrial internet of things (IIoT), on the Armis Security blog.

Customer story: Armis saves 70% on data pipeline cost with SingleStore

SingleStoreDB Self-Managed powers more cyber security leaders

Nucleus Security puts SingleStoreDB Self-Managed at the heart of its vulnerability management (VM) solution, an all-in-one data aggregation and process automation platform for network, cloud and application security. The company needed an underlying database that was truly fast and scalable to power its platform — and as it expanded into the private sector, its existing database became a bottleneck in supporting real-time security needs. Today, Nucleus automates VM processes and workflows, enabling organizations to mitigate vulnerabilities 20 times faster than existing methods.

Watch the webinar: Nucleus Security, Every Millisecond Counts for Cybersecurity

Twingo and Armis: Streamlining cybersecurity solutions through database consolidation

SingleStore recently hosted a webinar with two Israeli firms: Twingo, a big data consultancy, and Armis Security. Leaders from these firms discussed how SingleStoreDB Self-Managed has enabled them to deliver superior cyber security solutions through database consolidation.

The Armis Unified Visibility & Security Platform, powered by SingleStoreDB Self-Managed, processes 100 billion events per day (traffic, asset, user data and more) for its global customer base, with 30TB data sets in its largest customer environments. This creates a full, dynamic picture of all client assets, accessible within the product through free-form queries on devices, IP session data, predefined metrics and more, delivering 1.5-second query speeds across three days' worth of data.

Twingo represents, sells and deploys leading big data technologies. Experts in architectural design, Twingo helped Armis choose the right technology, and provides optimal big data solutions for complex problems. Twingo contributed to the proof of concept (POC) for the SingleStoreDB Self-Managed deployment at Armis, helping design the cluster sizing, redesign queries and optimize the data model, then defining and running the POC. In production, Armis now runs 32 managed SingleStore units; each unit consists of 8 CPU cores, 64GB of RAM and a 2TB SSD.

Here are some highlights from our conversation with Ilya Gulman, Twingo's chief technology officer. Ilya's comments below are translated from Hebrew.

In the beginning

"Twenty years ago, all we had was two major types of databases: Oracle and SQL Server. In the twenty years since then, many different database technologies popped up. The database world became overpopulated — each database handles specific solutions."

"Hiring a database person today is complicated, since the person must know a lot. The most important part of our job is requirements. Each database has its own specialty. Another issue is how many times and places we need to save data, and we will get there shortly. A key question is how we are going to approach the data. Are we going to approach it as a document store, or just text search? And if it is a combination of all of these, then we need a Swiss Army knife approach. Is the data updatable?
Quite important, since quite a few databases are not updatable."

The problem with modern architectures

"Take Amazon's [architecture] as an example; let me walk through it. We have events occurring, and we write them to Kafka. This is a very important step in the process. There are islands of information within the process. The more islands you have, the more complex the situation becomes."

Ilya describes four problems with the architecture. "The first problem is hiring people well-versed in multiple technologies. The main pitfall we see with multiple technologies is that if people don't have the expertise, we'll run into problems."

"The second issue is the fact that we have multiple datastores, and there is always some sort of discrepancy between data stores — or rather, inconsistencies. When those systems work for years and years, they tend to lose information."

"The third problem is the lack of 'self healing.' This is something most databases have on their own, and there's a feature built in that allows a database to return to 'base.' For example, if it loses partitioning, it will attempt to recover the partitioning. While these databases (like Redshift and Athena) have this feature individually, when you connect the databases together, the architecture doesn't have self-healing out of the box."

"You can overcome this by ignoring self-healing and working with what you have. Or, some organizations build their own self-healing protocol — which is an incredible amount of work, since you have to identify what breaks and what should be fixed."

"The last problem we encounter with this architecture is that, because there are so many connectors, we run into the need to join data stores. You might have a use case (Elastic) that needs a text search, followed by analytical joins (which are done in Amazon Redshift). And we want to take both and connect, do both A and B. But this really isn't easy. If you want to do this online, it's nearly impossible."

Can we simplify?

"If you look at the architecture, our goal is to switch out what we currently have (Elastic, Redshift, Redis), and in its place put one database that does everything the others did. It can do analytics, text search, key-value store, and on and on."

"And because it's all in one place, you don't need to worry about multiple joins, you don't need different skills, and ultimately the system has a self-healing process and will eliminate inconsistencies."

"If your use case supports it, the alternative to [several databases] is to do a consolidation. I'd like to say two things regarding SingleStore:

"We can use the SingleStore design to scale out, expand the number of servers, add memory as needed, etc. SingleStore allows both analytical work and processing work (as shown in the slide) in a single engine.

"It's also multi-model — you can work with it as a relational database, a semi-structured database, an index, full-text search, JSON, etc. SingleStore can keep data in memory for faster processing."

In conclusion

"Ultimately, there are two ways you can go. The first is specialization. While specialized systems are good in their specific ways and at solving specific problems, you'll run into the pitfalls described earlier. There are also a lot of database combinations that have to happen together."

"The second way is consolidation. This is much easier, but it's important to note that special, niche functionalities could be missing." (Ilya uses a text search example.)
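The consolidation Ilya describes, replacing a separate text-search engine and analytical store with one database, can be pictured with a short sketch. The snippet below is illustrative only: the schema, column names and full-text index are hypothetical, and it assumes a standard MySQL-protocol client (SingleStore speaks the MySQL wire protocol); it is not Armis's actual implementation.

```python
import pymysql

# Placeholder connection details only; SingleStore accepts MySQL-protocol clients.
conn = pymysql.connect(host="singlestore-host", user="app", password="***",
                       database="security", port=3306)

# One query combining a text-search predicate with an analytical join:
# the kind of work that would otherwise be split across Elastic and Redshift.
# Assumes a hypothetical `events` table with a FULLTEXT index on `raw_log`.
SQL = """
SELECT d.device_type,
       COUNT(*)          AS suspicious_events,
       MAX(e.event_time) AS last_seen
FROM   events e
JOIN   devices d ON d.device_id = e.device_id
WHERE  MATCH(e.raw_log) AGAINST ('failed login')
  AND  e.event_time >= NOW() - INTERVAL 3 DAY
GROUP  BY d.device_type
ORDER  BY suspicious_events DESC
"""

with conn.cursor() as cur:
    cur.execute(SQL)
    for device_type, count, last_seen in cur.fetchall():
        print(device_type, count, last_seen)
```

The specific query is not the point; the point is that the text filter and the aggregation happen in one engine, in a single round trip, instead of being stitched together across separate systems.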
To learn more about how SingleStoreDB Self-Managed powers today's cyber security leaders, watch the webinar (in Hebrew). Follow @SingleStoreDB on Twitter to keep up with all of our latest news. And try SingleStoreDB Self-Managed free.
Three Common MySQL Errors

MySQL has tens of thousands of users leveraging it for complex technical challenges, but single-box database systems still reach their limits when pushed. In this blog post, we summarize three of the most common errors users see on MySQL, some tips and tricks on how to approach them, and what to look for when seeking a more scalable alternative. Feeling like you've maxed out MySQL? Let's dive in and learn more.

Error #1: Too Many Connections

When building applications, you may find that as you add consumers you start to hit limits on the number of connections allowed. The default in MySQL is 151 connections, but you should be careful about raising this too high: MySQL uses one thread per connection, and having too many active threads can hurt performance. Each thread needs memory, and memory is expensive. Many users have tried to address this challenge by adopting managed flavors such as AWS RDS MySQL, but still face issues as load increases, particularly with table/row-level locking as the number of client connections grows.

Here at SingleStore, many users leverage our cloud DBaaS to handle highly concurrent workloads. Its MySQL wire protocol compatibility makes it easy to migrate your workloads. Connections are also extremely efficient in SingleStore, as the engine is smart about mapping connections to threads. If connections are idle, we immediately allocate threads to other work instead of hoarding them. This allows SingleStore to handle more connections than threads in most cases. SingleStore's default value for max_connections is 100,000. By comparison, the AWS RDS hard limit is 16,000, the Google Cloud SQL for MySQL limit is 4,000, and the Azure Database for MySQL limit is 20,000.

Error #2: Out of Memory

Running out of memory in MySQL could mean a few different things. The easiest place to start is to make sure tables have not exceeded their allocated memory (especially temporary tables). This can also often be caused by highly concurrent workloads of large queries that require lots of memory (GROUP BY, for instance). The InnoDB engine requires careful tuning when MySQL becomes resource constrained.

A common approach after exhausting a single-box VM's memory has been to move to something like MariaDB's columnstore architecture. However, columnstore architectures are typically built largely for analytical purposes and are not so great at handling real-time ingest or transactional queries. SingleStore's Universal Storage allows users to get the TCO benefits of a columnstore database with the performance and usability of a memory-optimized rowstore, like fast seeks and updates. Universal Storage allows applications to run operational transactions on data that cannot be stored in RAM at an affordable cost. Hash indexes and UPSERT support on columnstore tables give users the ability to leverage disk for OLTP, HTAP and analytics all in one place.
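Before reaching for a different database, it helps to know where a MySQL instance actually stands on the two errors above. The following is a minimal diagnostic sketch, with placeholder host and credentials, that reads the connection and memory limits discussed under Error #1 and Error #2.

```python
import pymysql

# Placeholder connection details; point this at your own MySQL instance.
conn = pymysql.connect(host="mysql-host", user="admin", password="***")

def show(cur, query):
    """Run a SHOW query and print each (name, value) row it returns."""
    cur.execute(query)
    for name, value in cur.fetchall():
        print(f"{name} = {value}")

with conn.cursor() as cur:
    # Error #1: how many connections are allowed vs. actually used?
    show(cur, "SHOW VARIABLES LIKE 'max_connections'")
    show(cur, "SHOW STATUS LIKE 'Max_used_connections'")
    show(cur, "SHOW STATUS LIKE 'Threads_connected'")

    # Error #2: in-memory temporary tables spill to disk past these limits,
    # and the buffer pool bounds how much data InnoDB can cache.
    show(cur, "SHOW VARIABLES LIKE 'tmp_table_size'")
    show(cur, "SHOW VARIABLES LIKE 'max_heap_table_size'")
    show(cur, "SHOW VARIABLES LIKE 'innodb_buffer_pool_size'")
```

If Max_used_connections is brushing up against max_connections, raising the limit (SET GLOBAL max_connections = ...) buys time, but remember the thread-per-connection and memory costs described above.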
Error #3: The Table Is Full

MySQL users often come across this issue when they run out of memory or disk space. Memory-related concerns typically arise because users have not allocated enough memory to handle both query processing and data storage. Disk-related issues are a bit easier to resolve given the lower TCO of disk; however, adding more disk may ultimately lead to a disproportionate amount of storage relative to the rest of the resources on a single MySQL VM. The InnoDB engine helps by compressing data by around 50%. SingleStore's Universal Storage (i.e., columnstore) offers 75-90% compression, helping dramatically reduce the amount of disk required to hold your data. However, simply compressing data is not enough. For columnstore to provide best-in-class performance, the engine must leverage seekable encodings and vectorized execution. SingleStore does both.

Conclusion

SingleStore offers a distributed, scalable alternative to MySQL. Universal Storage helps thousands of users support highly concurrent workloads with high compression rates. SingleStore offers many more unique differentiators, such as Pipelines to rapidly ingest data from Kafka, S3, GCS and more, with just a single SQL query (see the sketch below). Try SingleStoreDB Cloud today with the help of one of our cloud engineers. Click here to access your $500 in free credits!
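For readers curious what that single-statement ingest looks like, here is a hedged sketch of a SingleStore Pipeline. The Kafka broker, topic and target table are hypothetical, and the exact options (formats, credentials, offsets) depend on your setup.

```python
import pymysql

# Placeholder host and credentials; the broker, topic and table are hypothetical.
conn = pymysql.connect(host="singlestore-host", user="admin", password="***",
                       database="demo")

with conn.cursor() as cur:
    # One SQL statement defines continuous ingest from a Kafka topic into a table.
    # Format clauses (CSV, JSON, Avro) and credentials can be added as needed.
    cur.execute("""
        CREATE PIPELINE clicks_pipeline AS
        LOAD DATA KAFKA 'kafka-broker:9092/clickstream'
        INTO TABLE clicks
    """)
    # Start it, and the engine keeps pulling new messages in the background.
    cur.execute("START PIPELINE clicks_pipeline")
```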
Carbon, Cloud, and the Modern Data Estate

On this National Cut Your Energy Costs Day, it's a good time to think about our carbon footprints at home and at work as data professionals. Since the first home was electrified in Manhattan in the 1880s, our home electricity usage has grown dramatically. According to Energy.gov, residential homes now account for 22% of all electricity consumption in the U.S. Roughly 63% of this electricity is still generated by nonrenewable fossil fuel sources, according to the Energy Information Administration, but this varies a lot based on where you live in the country. In Georgia, where I live, nonrenewable fossil fuel sources account for about 71% of electric generation. No matter where you live, saving energy brings immediate benefits to you and helps reduce our carbon footprint. Today is also a good time to think about how changing some habits will save money on your monthly electricity bill, and how the larger collective impact of cutting energy use helps the environment by reducing our carbon footprint.

Here are three energy-saving tips that can make a difference. First, install a programmable thermostat. These can learn household behaviors, set temperatures at optimal levels for comfort, and save as much as 15 percent of electricity consumption. Second, finish replacing those energy-hungry incandescents with LED bulbs. Finally, unplug idle devices: laptops, televisions, even the coffee pot. The bricks and wall warts for those electronics and appliances are energy vampires that draw power even when the device is off, and can account for as much as 20% of your monthly bill.

But as important as our personal energy habits are, perhaps we should be more environmentally conscious about the impact of our choices as IT and data professionals. Our home and work lives have blurred in the restricted lifestyle this pandemic has caused, and our new home-bound behaviors are driving the largest, fastest adoption of digital services the world has ever seen. By day, we've turned to video conferencing from the kitchen counter for work. By night, we're watching The Queen's Gambit on Netflix and The Mandalorian on Disney+. (I highly recommend both.) But you may be wondering how the use of these digital services is impacting electricity usage.

Electricity and the Cloud

In the early months of the pandemic, after air travel dropped precipitously, the carbon footprint of video streaming services received a lot of attention. With the energy consumption of information and communication technologies (ICT) increasing 9% every year, The Shift Project's Lean ICT Report found that the carbon footprint of the ICT sector increased from 2.5% to 3.7% of global emissions, compared to air travel's 2.4% share. Of that 3.7%, 19% comes from data centers and 16% from network operations. Of course, video streaming services are just one type of digital service among many more SaaS applications delivered by both public cloud data centers and enterprise-owned data centers, and they require massive amounts of electricity to operate. The Natural Resources Defense Council (NRDC) estimated that data center electricity consumption would increase to roughly 140 billion kilowatt-hours annually in 2020, the equivalent annual output of 50 power plants, costing American businesses $13 billion in electricity bills and emitting nearly 100 million metric tons of carbon pollution per year. That is roughly 2-3% of all electricity usage in the U.S. per year.
Although it's invisible to us, our collective use of digital services is making a big impact on electricity usage and the environment. There is some good news on data centers: efficiency improvements have reduced the growth rate in their electricity consumption over the last 20 years. A study commissioned by the Department of Energy and conducted by the Lawrence Berkeley National Laboratory used decades' worth of data to observe the trend in data center electricity usage, and found that the rate of increase was estimated to stabilize at close to 4% from 2010 to 2020, compared to a 90% increase from 2000 to 2005 and a 24% increase from 2005 to 2010. Part of the efficiency gain is attributed to the reduced growth in the number of servers operating in public cloud data centers. Servers in the public cloud are operated at a higher utilization rate than those in enterprise-managed data centers. Amazon Web Services commissioned a study from 451 Research showing that its infrastructure-as-a-service was more than 3.6 times as energy efficient as the median of surveyed U.S. enterprise-owned data centers. It attributes that efficiency advantage to much higher server utilization and more efficient hardware. Google and Microsoft Azure data centers are achieving similar efficiency gains over corporate-owned data centers.

Managing the Data Estate

But just as our personal energy habits at home have a large effect on energy use, so do our IT decisions. How we manage the data powering these SaaS applications in the context of carbon may be the next big challenge, because where data is stored, where it's copied, how it's processed, and where it's transmitted all add up and have an impact. You or your cloud operations team sees the effect of that for your company's SaaS product in your cloud utility bill every month. Some of the line items may pop out, like a large cluster of m5.12xlarge instances in a test environment that's been left running for 30 days with no activity. In this case, the cloud-saving habit is no different from your home energy-saving habit: turn off the lights when you leave the room!

The carbon impact of other cloud decisions we make may be less obvious. Modern customer and business experiences delivered by SaaS applications depend on a diverse data backend. Microsoft refers to this as the "modern data estate," with data stored in different locations across different types of data stores, from operational databases to data warehouses to data lakes. Into this data estate flows a deluge of data from an increasing number of sources. Within the data estate we ingest, manage, and analyze this data using various types of storage appropriate to the processing and the need for freshness. In the data estate, you need to retain a long history of data to be able to access past and present data and predict the future.

I think the analogy of the estate is a useful one for thinking about the carbon impact of our data management decisions. Within the estate we have assets and liabilities, in terms of data assets and workloads. The data liabilities include the cost of copying and moving data. It has been conventional wisdom of late to pick a datastore per workload. There are complex decision trees available on how to pick from among almost 20 different specialty datastores for time-series, key-value, document, wide-column, unstructured text, graph or relational data.
There's also the choice about the type of processing needed, in terms of transactions or analytics. Consider the real-time data assets and workloads needed for your SaaS application or business. Think about how many different types of datastores are involved in creating, storing and processing those data assets and workloads. Also consider the machine learning models that operate on that data. The tally may be 3, 4 or more. Because it's as easy and convenient to spin up new datastores as it is to flip on a light, your data estate may be large and sprawling, which requires an estate staff with specialized skill sets to manage each of those assets. At SingleStore, we've encountered scenarios where as many as 14 different types of datastores were involved in producing real-time assets and serving real-time analytics.

Serving these diverse workloads on diverse data structures for real-time use cases is inherently inefficient. Big data becomes even bigger when it's copied and moved rather than operated on in place, as it arrives from streaming sources. In terms of the "data estate," we can reduce the liability and cost of creating, processing, and maintaining real-time assets by consolidating these workloads. There's no need to give up the convenience of instant availability in the cloud, or the data access styles and structures you've grown accustomed to when designing your application. Many have already moved off single-node and special-purpose databases to achieve greater efficiency by combining real-time operational and analytical workloads on diversely structured data, from binary and raw text to key-value, document and relational. Just as sharing hardware at the cloud infrastructure level results in higher server utilization and greater energy efficiency for data centers, building applications with a unified database that supports diverse workloads on diversely structured data reduces your data estate's liabilities. I argue that it also increases the value of your real-time data assets, since designing SaaS applications with SingleStore reduces latency and stores data more efficiently than other datastores through a combination of innovations.

Takeaway

So, unplug those energy vampires in your home and across your data estate. Take a modern approach to cut your energy consumption. Consider the advantages you gain by combining real-time workloads into fewer datastores, not only to simplify and accelerate your data, but also to conserve electricity and reduce the carbon footprint. By renewing and modernizing your data estate through reducing special-purpose datastores, you're directly following the environmental ethos of reduce, reuse, and recycle. I've said before that you must simplify to accelerate. Consider that by doing so, you may also "simplify to save."
10 Data Predictions for Your Data Strategy in 2021

With the world still recovering and reorganizing from COVID-19, and with the global pandemic having reshaped supply chains, infrastructure, and the way business is done, 2021 is anticipated to be a year of big gains for companies that embrace and implement strategic digital-first strategies.
A Forrester & SingleStore Q&A: Using Real-Time Analytics to Prevent & Fight the COVID-19 Pandemic

As a follow-up to our recent webinar with SingleStore Field CTO Domenic Ravita and our guest, Forrester VP and Principal Analyst Noel Yuhanna, we take a deeper dive into market-facing, pandemic-relevant questions. Noel shares some of his key learnings over the past eight months. He talks about how businesses are learning to leverage data and analytics during COVID-19, the suddenly urgent need for real-time data, the benefits of a multi-cloud data and analytics strategy, and why companies must focus on customers as digital transformation races forward.

What are the top questions organizations are asking about data and analytics during COVID-19?

We have been getting quite a few inquiries about data and analytics, especially around cloud, open source, real time, automation, and modernization. Cloud is helping organizations become more agile to improve customer experience, supply chains, and operational efficiency. Due to budget cuts, many organizations — including billion-dollar companies — are now seriously considering enterprise-wide cloud strategies, including for mission-critical applications. Open source has come up for discussion time and again. During the 2008 recession, organizations started to leverage various open source solutions for transactional and operational use cases. This year, open source is back at the forefront; even Global 2000 companies are now looking to leverage it for their tier-2 and tier-3 applications. We also see organizations asking more about real time, focusing on customer personalization and demand analysis for new products and services in various regions to control inventory. In addition, some organizations are looking to modernize their legacy platforms, especially as they realize their traditional platforms are failing to keep up with the new demands of self-service, real time, and connected data to support customer 360, risk analytics, and industry-specific analytics. Automation has also become a hot topic. Organizations want to do more with less when it comes to resources and process optimization. Businesses are seeking solutions that can help them automate data management and analytics functions to accelerate new insights.

Why are real-time data and insights even more critical during this pandemic?

Real-time data and insights have been growing in importance in general. As more organizations make progress in their digital transformations, more data is available about internal processes, customers, partners, supply chains, etc. Forrester has solid evidence that firms that are more advanced in using data analytics to make decisions are significantly more successful than those that are not. For example, firms that are rated 'advanced' in our insights-driven business maturity model are almost three times more likely to report double-digit year-on-year revenue growth than 'beginner' firms. They are also 1.4 times more likely to report that using analytics has reduced IT costs. So, to start with, it's critical for businesses to advance their data and analytics competencies and capabilities. And real-time data and analytics are simply more effective, allowing businesses to sense and respond in real time to changes in their customers' needs and behaviors, as well as to understand how their business is operating. With more digital activity moving to edge devices, computing has to be real time to take advantage of sensors and data that reflect what people and machines are doing. The pandemic amplified this need to a significant degree.
Customer behavior changed drastically, making existing predictive models obsolete. Firms that had advanced competencies could find out in real time how product demand shifted, what customers turned to as supply problems emerged, and more. Real-time data supported contact tracing, showed where people were traveling to, and helped public sector organizations manage the impact of the pandemic. Firms that are advanced in being insights driven are 1.6 times more likely to report that big data has increased their business agility.

Are more organizations embracing a multi-cloud data and analytics strategy?

Yes. Our primary data and analytics survey data did not ask about multi-cloud strategies, but it did ask organizations about the most important components of their data strategies. The top two components were 'big data integration' followed by 'public cloud big data services.' Also, when asked how their spending was changing for their primary cloud deployments, about 65% of respondents reported increases in spending in all categories of BI and analytics projects for the next year. So, cloud remains a very big trend, and anecdotally from speaking with our clients, multi-cloud is seen as the way to avoid lock-in with a particular cloud vendor and to be able to deal with issues related to cloud vendors' presence in different geographies, and to provide the best tools for different organizational cohorts. For some enterprises, having cloud implementations with multiple vendors occurs because of autonomous decision-making in different parts of the organization; for others, it is a conscious strategy to avoid lock-in and take advantage of various capabilities and offerings. Organizations that plan their architectures carefully are typically hybrid by design and multi-cloud by design.

Why must companies be customer centric in the age of digital transformation?

One of the main lines of research that Forrester has published is about how companies that are customer-obsessed will significantly outperform companies that are simply customer-aware — or worse. For 10 years Forrester has been telling clients that we are in the age of the customer. And, after a decade of the age of the customer, we know that businesses catalyze profitable transformation when they put the customer front and center in every decision. No matter how sophisticated your segmentation models, how rich your customer feedback platform, or how confident you are that you intuitively know what consumers want, you must continuously challenge your thinking about consumers for your business to thrive. To be successful, firms must deliver great customer experiences that resonate with each and every customer, and they must provide a consistent brand experience across digital and physical channels. They must maximize revenue and returns while constantly creating new consumer value. And they must compete in a world of disruption while deep-seated consumer needs remain the same. As digital transformations progress, the possibility of disruption increases and only an intense focus on the customer will enable a firm to thrive. This post was coauthored by Noel Yuhanna, Forrester VP Principal Analyst.
Digital Twins: An Important Next Step In The Data Economy

In this Forbes article, Nikita Shamgunov, SingleStore co-founder & co-CEO, discusses the importance of digital twins. While digital twins are frequently associated with industrial and heavy machinery use cases, a digital twin can be built for any business or product. Digital twins can also be found in consumer electronics and financial services. Nikita explains that for a digital twin to work, a business must create a live mirror, or clone, of all the data that is flowing through the veins of the enterprise. The organization must consolidate that data and implement a single pane of glass across all data services, providing a complete view of the business. This, he adds, requires the business to embrace modern data management, which creates the foundation for these next-generation capabilities. Successfully creating and maintaining a digital twin, however, requires real-time data access and scalable data management, which SingleStore can help to provide. You can try SingleStore for free today or contact us.
Digital Transformation Podcast: 10 Key Insights into Business in the COVID-19 Era

Listen in to this in-depth discussion with Raj Verma, SingleStore co-CEO, and Kevin Craine, host of the Digital Transformation Podcast. This 25-minute podcast interview covers how digital transformation has shifted from being a "nice-to-have" to a tactical "must-do." Raj discusses how software and data have helped businesses plan strategies to navigate their COVID-19 response and recovery. He shares advice and a call to action for companies to reframe their business strategies. Listen here for the full interview: Data Strategies for a Post-COVID Business World. In the COVID-19 era, businesses are undergoing digital transformations much faster than anyone could have predicted. When we look back at this period of time, I think we will find that COVID-19 is going to make us stronger, as companies, as individuals, and economically. To hear more about the 10 key areas that Kevin and Raj explore during this conversation, listen to the full interview.
SingleStore Establishes Commitment to Latin American Market with LATBC Strategic Partnership

The following press release has been issued by SingleStore today – Wednesday, July 22nd, 2020 – to announce the strategic partnership between SingleStore and top Latin American technology consultancy LATBC.

SingleStore Establishes Commitment to Latin American Market with LATBC Strategic Partnership

Established Company to Provide Service, Support in Mexico, Central America, and the Caribbean

SAN FRANCISCO – July 22, 2020 – SingleStore, The Single Database for All Data-Intensive Applications for operational analytics and cloud-native applications, has forged a new partnership with Latin America Business Consulting (LATBC) as the exclusive reseller for SingleStore in Mexico, Central America, and the Caribbean. LATBC, which has been operating in the region since 2003, will provide pre-sales and post-sales services, delivery, and support. This relationship extends the reach of the SingleStore team in this important region. Working with LATBC better positions the company to deliver cutting-edge technologies, enabling businesses in this part of the world to be globally competitive in this new era – one in which data has become an organization's most important asset.

The SingleStore-LATBC partnership provides customers with an opportunity to do business with a local entity that offers promotions, service, and support suited to their needs. SingleStore's selection process for its partner ecosystem requires that the partner company is aligned strategically with SingleStore, and brings the right blend of technological skill sets, geographic coverage, and company culture to deliver a seamless customer experience.

"We are excited to welcome LATBC as a SingleStore strategic partner," said SingleStore co-CEO Raj Verma. "LATBC embodies our ideal partner characteristics and is now part of our extended team in the Latin America region, representing SingleStore at the highest levels of the business sphere."

SingleStore's desire to create this partnership was motivated in large part by an interest in working with Xavier Espinosa de los Monteros, founder of LATBC and currently its CEO and executive chairman. Espinosa de los Monteros is well known and respected in the market, in which LATBC provides data and analytics solutions to important industry verticals such as financial services, media, and telecommunications.

"Our commitment at LATBC is to help clients create bridges that move them into the future and onto new and valuable paths," said Espinosa de los Monteros, LATBC CEO and executive chairman of the board. "Innovation is part of our DNA. LATBC's ongoing push to embrace new models, our significant experience in the region, and the power of SingleStore's groundbreaking technology will forge a path for our companies – and our customers – to build and benefit from the new data economy."

About LATBC

LATBC is a top technology consulting firm committed to data. Over the past 17 years they have delivered more than 250 successful projects in the US and Central & South America, with thousands of incredible collaborators. LATBC is a true believer in the value of data. Their goal is to make data the rainmaker, the power, the engine and the fuel of every enterprise. Big is not enough in data; what matters is how you take action with it. Visit latbc.com or follow us @latbc.

About SingleStore

SingleStore is The Single Database for All Data-Intensive Applications, powering modern applications and analytical systems with a cloud-native, massively scalable architecture.
SingleStore delivers maximum ingest, accelerated transaction processing, and blisteringly fast query performance, including AI integration and machine learning models, all at the highest concurrency. Global enterprises use the SingleStore distributed database to easily ingest, process, analyze, and act on data, to thrive in today’s insight-driven economy. SingleStore is optimized to run on any public cloud, or on-premises with commodity hardware. Visit www.singlestore.com or follow us @SingleStoreDB.
SingleStore Expands Collaboration With Amazon Web Services, Joins ISV Workload Migration Program

The following press release has been issued by SingleStore today – Wednesday, July 15th, 2020 – to announce expanded collaboration between SingleStore and AWS.

SingleStore Expands Collaboration With Amazon Web Services, Joins ISV Workload Migration Program

New Effort Will Enable Customers to Accelerate the Move to Cloud-Based Operational Analytics

SAN FRANCISCO – July 15, 2020 – SingleStore, The Single Database for All Data-Intensive Applications for operational analytics and cloud-native applications, has expanded its collaboration with Amazon Web Services (AWS) by joining the AWS ISV Workload Migration Program (WMP). By participating in this program, which helps AWS Partner Network (APN) Technology and Consulting Partners migrate independent software vendor (ISV) workloads to AWS via a repeatable migration process, SingleStore will accelerate the customer journey to the cloud for operational analytics.

"Our customers include global enterprises that are leaders in their industries. Having SingleStoreDB Self-Managed 7.1 available on the AWS platform, which is suited for all types of workloads, gives customers easy access to the fastest, most scalable SQL database in the world," said SingleStore co-CEO Raj Verma. "SingleStore is The Single Database for All Data-Intensive Applications, providing solutions for all enterprise workloads offering speed, scale and SQL. That makes this a uniquely powerful combination poised to fuel expansion for both companies."

The expanded collaboration brings significant value to SingleStore customers and the companies themselves. Together, AWS and SingleStore provide a blueprint for adoption, robust technology and other resources to enable customers to harness their operational data at scale. SingleStore's modern data architecture can leverage Apache Kafka and AWS services like Amazon SageMaker for streaming data, artificial intelligence (AI) and machine learning (ML). SingleStore also leverages solutions like Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Simple Storage Service (Amazon S3) to provide full extensibility for real-time operational analytics.

Broadening this collaboration comes at a particularly opportune time, as Gartner expects that three-fourths of databases will be deployed or migrated to a cloud platform by 2022. This trend will be largely due to databases used for analytics via the software-as-a-service (SaaS) model.

AWS recently selected SingleStore as one of five technologies it is highlighting in the AWS Global Financial Services Campaign. The campaign illustrates SingleStore's ability to support financial services companies in managing time series data. AWS and SingleStore first collaborated in 2018. In addition to its membership in the AWS ISV WMP, SingleStore is an Advanced Technology Partner in the APN, a global program for technology and consulting businesses that leverage AWS to build solutions and services for customers.

SingleStore's bring-your-own-license and metered offerings are available in AWS Marketplace, and the company has successfully transacted private offers through that platform. SingleStoreDB Cloud – a fully managed, on-demand, elastic cloud database – also runs on AWS.

"Businesses in the financial services, manufacturing and telco spaces are looking for proven and innovative partners for reducing the time-to-insight with the scalability and convenience of the cloud.
Decision velocity — the ability to make faster decisions — is paramount for every organization," said R "Ray" Wang, principal analyst, founder and chairman of Constellation Research. "This relationship brings all those components together."

About SingleStore

SingleStore is The Single Database for All Data-Intensive Applications, powering modern applications and analytical systems with a cloud-native, massively scalable architecture. SingleStore delivers maximum ingest, accelerated transaction processing and blisteringly fast query performance, including AI integration and machine learning models, all at the highest concurrency. Global enterprises use the SingleStore distributed database to easily ingest, process, analyze and act on data, to thrive in today's insight-driven economy. SingleStore is optimized to run on any public cloud or on-premises with commodity hardware. Visit www.singlestore.com or follow us @SingleStoreDB.

Contact
Gina von Esmarch
415-203-4660
gvon@singlestore.com
A HOAP-ful New Perspective from 451 Research

Analyst firm 451 Research has come up with new research that sees a bright future for HOAP – hybrid operational and analytical processing. The report is titled 451 Perspective: A HOAP-ful future for hybrid operational and analytical processing.

This new type of processing has received several different names from different analyst firms and consultancies: HTAP (Gartner), translytical processing (Forrester), HOAP (451 Research), and operational analytics (common amongst other analyst firms). By any of these names, this new style of data processing – which unifies transactions and operational analytics, including many data warehousing-type functions – is widely believed to have a bright future. And SingleStore is right in the middle of it.

What HOAP Replaces

In a previous report, 451 Research identified HOAP as an emerging category. Now, they see HOAP experiencing broad adoption. HOAP seeks to unify two formerly separate data processing categories, and to largely eliminate the need for a third:

Online transaction processing (OLTP). Transaction processing systems have various needs for reliability, with requirements ranging from strong – "nearly always works" – to absolute – "must work every time, across time zones and disparate data centers, safeguarding against serious financial and reputational consequences for data loss or significant downtime."

Online analytical processing (OLAP). Analytics processing systems, which include data warehousing systems, data marts, and data shoeshine stands (just kidding), typically work on copies of existing data. They must be fast, reliable, scalable to multiple apps and users, and affordable, with SQL support.

Extract, transform, and load (ETL). Systems which transform and format data from ingest or OLTP systems to OLAP systems have become a separate category of their own. Using an ETL system reduces the requirements for the OLTP and OLAP systems that an ETL product connects to, but adding a third system to the mix increases cost and complexity, as well as ensuring significant end-to-end latency.

One of the advantages that 451 Research cites for OLTP systems, and their use of rowstore tables, is the ability to handle complex queries with joins. They also cite architectural advantages to the separation of transactions, which are well suited to rowstore tables, and analytics, which usually benefit from the use of columnstore tables. SingleStore, however, blurs these distinctions – and, with Universal Storage, is on track to nearly eliminate them. SingleStore not only supports both table types in a single database; it supports joins, and other operations, for rowstore tables, columnstore tables, and across table types, as sketched below.
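As a rough illustration of what "both table types in a single database" means, here is a minimal sketch. The table names and columns are invented, and DDL defaults (such as which table type a plain CREATE TABLE produces) vary by SingleStore version and configuration; the point is the join across a transactional rowstore-style table and an analytical columnstore table in one engine.

```python
import pymysql

# Illustrative connection details only; SingleStore accepts MySQL-protocol clients.
conn = pymysql.connect(host="singlestore-host", user="app", password="***",
                       database="demo")

with conn.cursor() as cur:
    # Hot, frequently updated transactional state. Intended as a rowstore table;
    # whether a plain CREATE TABLE produces rowstore depends on version/config.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS accounts (
            account_id BIGINT PRIMARY KEY,
            balance    DECIMAL(18,2)
        )
    """)

    # Large analytical history, declared as a columnstore table.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS transactions (
            account_id BIGINT,
            amount     DECIMAL(18,2),
            ts         DATETIME,
            KEY (ts) USING CLUSTERED COLUMNSTORE
        )
    """)

    # One query joins across the two table types: transactions and analytics
    # served by the same engine, which is the HOAP / translytical idea.
    cur.execute("""
        SELECT a.account_id, a.balance, SUM(t.amount) AS last_30_days
        FROM   accounts a
        JOIN   transactions t ON t.account_id = a.account_id
        WHERE  t.ts >= NOW() - INTERVAL 30 DAY
        GROUP  BY a.account_id, a.balance
    """)
    print(cur.fetchall())
```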
Cloud Database Trend Report from DZone Features SingleStore

A new Trend Report from DZone highlights the move to cloud databases. The report paints a bright picture for cloud database adoption, with roughly half of developers asserting that all of their organization's data will be stored in the cloud within three years. You can get a copy of the report from SingleStore.

DZone has issued a new trend report on cloud databases. In the report, leaders in the database space focus on specific use cases, calling out the factors that help you decide what you need in any database, especially one that's in the cloud. The advantages of cloud databases include the flexibility to scale up and scale back, easier data backups, moving database infrastructure out of house, and offloading some database maintenance.

SingleStore is a database that runs anywhere Linux does: on-premises and on all three major cloud providers – AWS, Google Cloud Platform (GCP), and Microsoft Azure. SingleStoreDB Cloud is a managed service with the SingleStore database at its core. SingleStoreDB Cloud is available on AWS and GCP, with Azure support to follow soon. The SingleStore Kubernetes Operator gives you the flexibility to manage this cloud-native database with cloud-native tools. SingleStore is also a fast, scalable SQL database that includes many features normally claimed only by NoSQL databases: easy scalability, fast ingest, fast query response at volume, and support for a wide range of data types, especially JSON and time series data (a brief example follows below). Between SingleStoreDB Self-Managed (the version you download and run on Linux) and SingleStoreDB Cloud, the advantages of cloud databases – scalability, easy and reliable backups, and moving both infrastructure and maintenance out of house – are readily available, on a solution that's also identical on-premises.

The report points out several interesting facts:

Slightly more than half of organizations that have a cloud database solution in place have had one for two years or less.

More than two-thirds of cloud database users either use multiple clouds (40%) or are seriously considering doing so (26%).

Analytics is the #1 reason for moving databases to the cloud, with modernization of existing apps and becoming cloud native also ranking highly.

The database-as-a-service (DBaaS) model, represented by SingleStoreDB Cloud and many other options, has a slight lead over self-managed databases.

About half of respondents believe all of their data will be in the cloud within three years.
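To make the JSON and time series point a bit more concrete, here is a small, hypothetical sketch. The table, columns and values are invented, and the connection uses a standard MySQL-protocol client; it shows semi-structured data queried with plain SQL.

```python
import pymysql

# Placeholder connection details; the table and data are hypothetical.
conn = pymysql.connect(host="singlestore-host", user="app", password="***",
                       database="demo")

with conn.cursor() as cur:
    # A timestamped table with a JSON column for semi-structured payloads.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS sensor_readings (
            device_id BIGINT,
            ts        DATETIME,
            reading   JSON
        )
    """)
    cur.execute(
        "INSERT INTO sensor_readings VALUES (%s, NOW(), %s)",
        (42, '{"temp_c": 21.5, "status": "ok"}'),
    )
    # SQL over the JSON payload alongside ordinary time-range filtering.
    cur.execute("""
        SELECT device_id,
               JSON_EXTRACT_DOUBLE(reading, 'temp_c') AS temp_c
        FROM   sensor_readings
        WHERE  ts >= NOW() - INTERVAL 1 HOUR
          AND  JSON_EXTRACT_STRING(reading, 'status') = 'ok'
    """)
    print(cur.fetchall())

conn.commit()
```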
SingleStore as a Data Backbone for Machine Learning and AI

SingleStore co-founder and co-CEO Nikita Shamgunov gave the keynote address and a session talk at the AI Data Science Summit in Jerusalem earlier this summer. The session talk, presented here, describes the use of SingleStore as a data backbone for machine learning and AI. In this talk, he gives a demonstration of the power of SingleStore for AI, describes how SingleStore relates to a standard ML and AI development workflow, and answers audience questions. This blog post is adapted from the video for Nikita’s talk, titled Powering Real-Time AI Applications with SingleStore. – Ed.
Video: Modernizing Data Infrastructure for AI and Machine Learning

The AI Data Science Summit 2019 featured a keynote by SingleStore’s CEO, Nikita Shamgunov, where he was hosted by SingleStore partner Twingo. Nikita, a co-founder of SingleStore and the technical lead from the beginning, has shepherded SingleStore’s development toward a world where cloud, AI, and machine learning are leading trends in information technology. Now that these trends are becoming predominant, SingleStore is playing an increasing role, as Nikita discussed in his keynote. What follows is an abbreviated version of his presentation, which you can view in full here. – Ed. Today I want to talk about the demands of AI and machine learning data infrastructure. Certainly the promise is very big, right? I couldn’t be more excited about all the innovation that’s coming in retail, in health care, in transport and logistics.
How CEOs Can Stay Relevant in the Age of AI

The most important new skills for business leaders are not what you might think. You've read the headlines. Data is the new oil; it's the new currency; data capital can create competitive advantage. We also hear, over and over again, that machine learning (ML) and artificial intelligence (AI) are the future. Few will dispute that these things are true, but the trite language masks a deeper challenge. Data must be collected, analyzed, and acted upon in the right way and at the right time for a business to create value from it. ML and AI are only as powerful as the data that drive them. In this world, big companies – which may throw away as much data in a day as a startup will generate in a year – should have a significant competitive advantage. However, new approaches are needed to move forward effectively in the age of ML and AI. And the new approaches all start with the data itself. To build and maintain a successful business in today's insight-driven economy, business leaders need to develop a new set of skills. We outline what we believe those skills are below.

Skill #1: A Drive to Find Key Data and Make it Useful

Business leaders need to be on a mission to collect and (more importantly) expose to their organizations all of the data that might create a competitive advantage. We don't always know exactly what data or insights might be the ones that will allow us to break away from the pack until after we have analyzed and acted on that data, then measured the results and repeated the cycle, over and over. Business leaders need to encourage collecting as much data as possible in the day-to-day operations of the business, with a particular eye towards where your organization has advantages or challenges. Make sure that the data is not simply collected, but stored in such a way that your teams can easily access, understand, and analyze it. "Big data" was a great start to enabling the future of our businesses, but what we need today instead is "fast data" – data made available to everyone, to drive fast insight.

Skill #2: The Ability to Create a Culture of Constant Analysis and Action

As the French writer Antoine de Saint-Exupéry stated, "If you want to build a ship, don't drum up people together to collect wood and don't assign them tasks and work, but rather teach them to long for the vast and endless sea." This adage applies to becoming an insight-driven business. Data is not insight, and insights are not outcomes. What we seek in collecting and analyzing data is to identify and carry out the actions that will accelerate and transform our business. The best way to leverage data for creating competitive advantage is to encourage a culture of inquisitiveness, of always asking "the 5 Whys" – a series of "why" questions that take us to the root of what's important, and why. Compel your teams to constantly look for ways to not just gather and share insights, but to turn insights into immediate actions that add value to the business. Innovations such as ecommerce product recommendations, dynamic pricing based on demand, or sensor-based maintenance are all insight-driven innovations that have arisen in the last decade or so and that have generated dramatic competitive advantage. ML and deep learning – the most practical form of AI currently available to business – accelerate this process.
You can use them together to test multivariate alternatives, to vary assumptions and audiences around your current performance, to help you maximize the value of the insights that you find and implement today, and then to help you take your insights to another, higher level.

Skill #3: The Insight to Choose the Right Tools and Technologies

The agile movement does not get nearly enough credit for the transformative effect it's had, and continues to have, on business. But a business can only be agile with the right tools and technologies, and the ability to use them to drive action and change. It's no surprise that, up to this point, most of the companies and leaders that are making the best use of data to drive their businesses are digital natives – think Google, Facebook, Uber, Airbnb, et al. They have done this by applying the agile mindset of software development to data architecture, data engineering, and data-driven decisioning. While the large digital players may have leapt to the forefront in the last 10 years, the traditional enterprise can use its long operational history, its existing volumes of data, and its ability to generate fresh, useful data, to level the playing field and compete effectively in the modern economy.

In order to maximize and utilize these resources, business leaders need to lead the decision making around data infrastructure. The insight-driven enterprise needs the best possible tools and technology to enable fast, flexible, and efficient use of the company's data. This means shifting the traditional IT mindset from maintaining legacy data infrastructure, overly strict controls, and inflexibility, to one that puts agility first. Analysts, data scientists, and application developers need access to real-time or near-real-time data sources. And they, and the businesspeople who work with them most closely, need to be empowered to act on that data – be it for rapid decision making or to create insight-driven, dynamic experiences for customers and employees. This shift requires a new set of tools, processes, and culture that is so critical to the future of the business that business leaders – all the way up to the CEO – need to ensure that agility is the primary order of the day.

Peter Guagenti is CMO at SingleStore, and is an advisor and a board member for several AI-focused companies. Peter spent more than a decade helping Fortune 500 companies to embrace digital transformation and to use real-time and predictive decisions to improve their businesses.