Types of OLAP Systems

What Are The Types of OLAP Systems?

OLAP systems vary considerably, and they have generally been distinguished by a letter tacked onto the front of the acronym “OLAP,” for On-Line Analytical Processing. MOLAP and ROLAP have classically been the most established types; other distinctions, such as SOLAP and DOLAP, represent little more than vendors’ marketing efforts to set themselves apart. Here, we aim to give you an idea of what these distinctions have meant.

 

The New Direction in OLAP Technology:

The newest software in the OLAP and Business Intelligence world combines, in real time, the benefits of relational tables and multidimensional business data modeling. The latest technology dispenses with the proprietary format of its MOLAP predecessors by storing data in source relational tables, such as SQL Server. Lastly, new OLAP technology maintains a constant connection with existing back-end systems and delivers immediately responsive reports and analytics in Excel and other front-end tools (dashboards, query tools, etc.).

If you like the sound of that, check out Olation® from PARIS Tech, the sponsor of OLAP.com.

 

Major OLAP Technology Types:

Hybrid Transaction / Analytical Processing (HTAP)

Gartner coined the term HTAP in an early-2014 paper to describe new in-memory data systems that perform both online transaction processing (OLTP) and online analytical processing (OLAP).

HTAP relies on newer and much more powerful, often distributed, processing: sometimes it involves a new hardware “appliance,” and it almost always requires a new software platform. Beyond this, the key point is that all the technology is sited in the relational database. As a result, there is no more data replication, and new transactional information becomes part of an analytical model as quickly as is technologically possible.
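The “no replication” idea can be illustrated with a toy sketch (this is not a real HTAP engine, and the table and column names are invented for the example): a single store serves both transactional writes and analytical reads, so a new transaction is visible to analytics immediately, with no extract-and-copy step in between.

```python
import sqlite3

# One in-memory store serves both workloads: transactional inserts
# and an analytical aggregate run against the very same table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")

# OLTP side: record individual transactions as they happen.
conn.execute("INSERT INTO sales VALUES ('East', 'Widget', 120.0)")
conn.execute("INSERT INTO sales VALUES ('West', 'Widget', 80.0)")
conn.commit()

# OLAP side: an analytical aggregate over the same data, with no
# replication into a separate analytical database.
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE product = 'Widget'"
).fetchone()[0]
print(total)  # 200.0
```

In a genuine HTAP platform the same principle holds at scale: the analytical model reads directly from the transactional tables rather than from a copied snapshot.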

HTAP represents a new way to tie data together that hasn’t been possible before: a real uniting of relational data stored in tables with the data models that business leaders use for decision making.

For an example of an HTAP product, check out Olation® from PARIS Tech, the sponsor of OLAP.com. Olation can be categorized as an HTAP product — even the name Olation implies the combination of “OLAP” and “relational” technologies.

Multidimensional OLAP (MOLAP) – Cube based

MOLAP products enable end-users to model data in a multidimensional environment, rather than providing a multidimensional view of relational data, as ROLAP products do (see below).

The structure of a multidimensional model is not a series of tables (as exists in a relational database) but what is generally referred to as a cube. Cubes modeled in a multidimensional database extend the concept associated with spreadsheets: just as a cell in a spreadsheet represents the intersection of two dimensions (sales of product by region), a cell in a cube represents the intersection of members from any number of dimensions (e.g., Products, Customers, Regions, Months, and so on). As in a spreadsheet, a cell might be calculated by formulas involving other cells.

In short, multidimensional databases allow users to add extra dimensions, rather than additional tables, as in a relational model. And the MOLAP cube structure allows for particularly fast, flexible data-modeling and calculations. For one, locating cells is vastly simplified—an application can identify a cell location by name (at the intersection of dimension members) rather than by searching an index or the entire model (via SQL SELECT statements), as in a relational database.  Further, multidimensional models incorporate advanced array-processing techniques and algorithms for managing data and calculations. As a result, multidimensional databases can store data very efficiently and process calculations in a fraction of the time required of relational-based products.
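The cell-addressing idea above can be sketched in a few lines (a deliberately minimal illustration, not how any MOLAP engine is actually implemented; all member names are invented): each cell lives at a named intersection of dimension members, so locating it is a direct lookup rather than a SQL search over tables.

```python
# Cells keyed by (Product, Region, Month) dimension members.
cube = {
    ("Widget", "East", "Jan"): 120.0,
    ("Widget", "West", "Jan"): 80.0,
    ("Gadget", "East", "Jan"): 50.0,
}

# Locate a cell by name, at the intersection of dimension members --
# no index scan, no SELECT statement.
value = cube[("Widget", "East", "Jan")]

# A calculated cell, as in a spreadsheet formula involving other cells.
cube[("Widget", "Total", "Jan")] = (
    cube[("Widget", "East", "Jan")] + cube[("Widget", "West", "Jan")]
)
print(value, cube[("Widget", "Total", "Jan")])  # 120.0 200.0
```

Real multidimensional databases add array storage, sparsity handling and aggregation algorithms on top of this basic addressing scheme.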

What are the perceived drawbacks of MOLAP tools?

For one, relevant data must be transferred from relational systems, which is a potentially “redundant” re-creation of data in another (multidimensional) database. Once data has been transferred, there may be no simple means for updating the MOLAP “engine” as individual transactions are recorded by the RDBMS. Also, MOLAP products are typically proprietary systems. For some IT departments, introducing a new database system is anathema, even if it means significantly greater productivity for the type of planning, analysis and reporting that end-users rely on the (MOLAP) solution to perform.

For a good example of a fast, scalable MOLAP product, check out PowerOLAP® from PARIS Tech, the sponsor of OLAP.com.

Relational OLAP (ROLAP) – Star Schema based

ROLAP products (for Relational OLAP) are credited with being able to directly access data stored in relational databases. The notion is that they can readily retrieve transactional data, although this becomes suspect when very large data sets are in play, or if more complex calculations are to be delivered, based on the transactional data. ROLAP products enable organizations to leverage their existing investments in RDBMS (relational database management system) software.

ROLAP products access a relational database by using SQL (structured query language), which is the standard language that is used to define and manipulate data in an RDBMS. Subsequent processing may occur in the RDBMS or within a mid-tier server, which accepts requests from clients, translates them into SQL statements, and passes them on to the RDBMS.
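What such a mid-tier translation step might look like can be sketched as follows (a hypothetical illustration; the function and table names like `build_sql` and `fact_sales` are invented, and real ROLAP servers generate far more elaborate SQL): a multidimensional request, expressed as dimensions to group by and a measure to aggregate, becomes a SQL statement for the RDBMS.

```python
# Translate a simple multidimensional request into SQL, the way a
# ROLAP mid-tier server shields end-users from writing SQL themselves.
def build_sql(dimensions, measure, fact_table):
    cols = ", ".join(dimensions)
    return (
        f"SELECT {cols}, SUM({measure}) "
        f"FROM {fact_table} GROUP BY {cols}"
    )

sql = build_sql(["region", "product"], "amount", "fact_sales")
print(sql)
# SELECT region, product, SUM(amount) FROM fact_sales GROUP BY region, product
```

Even this trivial example hints at the drawback discussed next: everything the user asks for must ultimately be expressible in SQL’s vocabulary.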

ROLAP products provide GUIs and generate SQL execution plans that typically remove end-users from the SQL writing process. However, this over-reliance on processing via SQL statements—including processing for multidimensional analysis—is a drawback. Whether it is generated “transparently” or not, SQL is the language of relational tables: SQL’s vocabulary is limited and its grammar often inflexible, at least to accommodate the most sophisticated modeling required for multidimensional analyses.

There are further drawbacks to structuring a multidimensional model solely within relational tables: Before end-users can submit requests, the relevant dimension data must be extracted and reformatted in de-normalized structures known as star schemas or snowflakes (so-called because of the way the tables are conjoined). These tabular structures are necessary to provide acceptable analytical performance. Sophisticated ROLAP applications also require that aggregate tables be pre-built and maintained, eliminating the need to process summary data at runtime.

One advantage of ROLAP over the other styles of OLAP analytic tools is that it is deemed more scalable in handling huge amounts of data. ROLAP sits on top of relational databases, enabling it to leverage the many capabilities a relational database provides.

Hybrid OLAP (HOLAP)

HOLAP is the product of the attempt to incorporate the best features of MOLAP and ROLAP into a single architecture. This kind of tool tries to bridge the technology gap of both products by enabling access to both multidimensional database (MDDB) and Relational Database Management System (RDBMS) data stores. HOLAP systems store larger quantities of detailed data in relational tables, while the aggregations are stored in pre-calculated cubes. HOLAP also has the capacity to “drill through” from the cube down to the relational tables for detailed data. Some of the advantages of this system are better scalability, quick data processing and flexibility in accessing data sources. The issue with HOLAP systems lies precisely in the fact that they are hybrids: at best they partake of the strengths of other systems, but they also evince the weaknesses of each, in an attempted mashup of two distinct technologies.
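The split between pre-calculated aggregates and relational drill-through can be sketched in miniature (an illustrative assumption throughout; the `aggregates` store, the `"ALL"` member and the table names are invented for the example): summary requests are answered from the cube’s pre-built aggregates, while detail requests fall back to the relational tables.

```python
import sqlite3

# Detailed data lives in relational tables...
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "Widget", 120.0), ("East", "Gadget", 50.0)],
)

# ...while aggregations live in a pre-calculated store (the "cube").
aggregates = {("East", "ALL"): 170.0}

def query(region, product):
    # Summary request: serve from the cube's pre-built aggregates.
    if product == "ALL":
        return aggregates[(region, "ALL")]
    # Drill-through: fetch the detail from the relational store.
    return conn.execute(
        "SELECT SUM(amount) FROM sales WHERE region=? AND product=?",
        (region, product),
    ).fetchone()[0]

print(query("East", "ALL"))     # 170.0, answered from the cube
print(query("East", "Widget"))  # 120.0, answered by drill-through
```

The two code paths in `query` are exactly where the hybrid’s strengths and weaknesses meet: fast summaries from one store, detailed drill-through from the other, at the cost of keeping both in sync.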

Other Types:

There are also less popular kinds of OLAP technology that one might encounter every so often, listed below. Some of these self-designated product types no longer really exist. (An example is WOLAP, since nearly all products now provide a Web interface to meet market demand.) But they are included to provide background on how vendors have tried to set themselves apart, and on how the OLAP market developed over time.

Desktop OLAP (DOLAP)

Desktop OLAP, or “DOLAP,” is based on the idea that a user can download a section of an OLAP model from another source and work with that dataset locally, on their desktop. DOLAP is purportedly easier to deploy, at a potentially lower cost, but almost by definition offers limited functionality in comparison with other OLAP applications.
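The “download a section of a model” idea amounts to taking a slice of a server-side cube and working with it locally. A toy sketch (all names and the slicing function are invented for illustration):

```python
# A server-side cube, keyed by (Product, Region, Month) members.
server_cube = {
    ("Widget", "East", "Jan"): 120.0,
    ("Widget", "West", "Jan"): 80.0,
    ("Gadget", "East", "Jan"): 50.0,
}

def download_slice(cube, region):
    # Keep only the cells for one region -- the "section" a DOLAP
    # user would pull down to work with offline on the desktop.
    return {key: val for key, val in cube.items() if key[1] == region}

local_cube = download_slice(server_cube, "East")
print(sorted(local_cube))
```

The local copy is smaller and disconnected from the source, which is both DOLAP’s appeal and the root of its limited functionality.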

Web OLAP (WOLAP)

Simply put, WOLAP signifies a Web browser-based OLAP technology. It suggests a technology that is Web-based only, without any option for a local install or a local client to access data. The most appealing features of this style of OLAP were (past tense intended, since few products categorize themselves this way) the considerably lower investment on the client side (“all that’s needed is a browser”) and enhanced accessibility to the data. The fact is that by now most OLAP products provide an option for Web-only connectivity, while still allowing other client options for more robust data modeling and other functionality than a Web client can provide.

Mobile OLAP

Mobile OLAP simply refers to OLAP functionality on a wireless or mobile device. This enables users to access and work on OLAP data and applications remotely through their mobile devices.

Spatial OLAP (SOLAP)

The aim of Spatial OLAP (thus, SOLAP) is to integrate the capabilities of both Geographic Information Systems (GIS) and OLAP into a unified solution, thus facilitating the management of both spatial and non-spatial data. The driving idea is to provide quick exploration of data that can point to trends and analysis in a geographic context, whether place-names sourced from a GIS or overlaying maps that show, for example, customer purchase behavior.

 

FREE OLAP & BI Software

Free Download of PowerOLAP Personal
