OLAP and Hadoop: A Great Pairing
OLAP continues to be a relevant and exciting technology, most recently in its pairing with Hadoop. As OLAP.com, we have ALWAYS seen the value of OLAP technology. We admit OLAP has been a bit out of style in recent years. Some companies even run Google ads claiming that “OLAP is obsolete,” but nothing could be further from the truth. (Check out our blog post on that one.)
We see this in the fashion industry all the time: what is old is new again! This is rare in the technology realm, but it seems to be the case with OLAP. As developers struggled to get value out of Hadoop data, they discovered they needed the speed and flexibility of OLAP. Together, OLAP and Hadoop are a powerful combination for reaching the ultimate goal: extracting value from Big Data.
Bringing OLAP to scale for Big Data
In a ZDNet article, “Is this the age of Big OLAP?”, Andrew Brust writes about the new relationship between OLAP and Hadoop. He highlights that OLAP technology can be particularly beneficial when working with extremely large Big Data sets. Typically, OLAP has not been scalable enough for Big Data solutions, but the technology continues to progress, and we find this new application of OLAP exciting. Brust discusses a few strategies for bringing the two technologies together, mentioning several OLAP vendors in detail and how they manage the issue of scalability for OLAP software.
If you want to try using OLAP with Hadoop, perhaps give PowerOLAP, OLAP.com’s mature OLAP product, a try. A free version of PowerOLAP is available. If you plan to test PowerOLAP with your Hadoop data, contact PARIS Tech and they will lift the member limit that ships with the free version, since you will need to go beyond it.
In sum, OLAP.com is pleased to see OLAP rising in relevance once again and getting some of the recognition we felt it deserved all along. It is a testament to the power and value OLAP has as a technology.
The Power of OLAP and Excel
Should Excel be a key component of your company’s Business Performance Management (BPM) system? There’s no doubt how most IT managers would answer this question. Name IT’s top ten requirements for a successful BPM system, and they’ll quickly explain how Excel violates most of them. Even the user community is concerned. Companies are larger and more complex now than in the past; they are too complex for Excel. Managers need information more quickly now; they can’t wait for another Excel report. Excel spreadsheets don’t scale well. They can’t be used by many different users. Excel reports have many errors. Excel security is a joke. Excel output is ugly. Excel consolidation occupies a large corner of Spreadsheet Hell. For these reasons, and many more, a growing number of companies of all sizes have concluded that it’s time to replace Excel. But before your company takes that leap of faith, perhaps you should take another look at Excel, particularly when it can be enhanced by an Excel-friendly OLAP database. That technology eliminates the classic objections to using Excel for business performance management.
Excel-friendly OLAP products cure many of the problems that both users and IT managers have with Excel. But before I explain why this is so, I should explain what OLAP is, and how it can be Excel-friendly. Although OLAP technology has been available for years, it’s still quite obscure. One reason is that “OLAP” is an acronym for four words that are remarkably devoid of meaning: On-Line Analytical Processing. OLAP databases are more easily understood when they’re compared with relational databases. Both “OLAP” and “relational” are names for a type of database technology. Oversimplified, relational databases contain lists of stuff; OLAP databases contain cubes of stuff.
For example, you could keep your accounting general ledger data in a simple cube with three dimensions: Account, Division, and Month. At the intersection of any particular account, division, and month you would find one number. By convention, a positive number would be a debit and a negative number would be a credit. Most cubes have more than three dimensions. And they typically contain a wide variety of business data, not merely General Ledger data. OLAP cubes also could contain monthly headcounts, currency exchange rates, daily sales detail, budgets, forecasts, hourly production data, the quarterly financials of your publicly traded competitors, and so on.
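To make the cube idea concrete, here is a minimal sketch in Python. The dict-based cube, the sample accounts, and the `cell`/`total` helpers are illustrative assumptions, not any vendor’s API.

```python
# A toy three-dimensional OLAP cube sketched as a Python dict.
# Keys are (account, division, month) tuples; values are signed amounts
# (positive = debit, negative = credit, per the convention above).
cube = {
    ("Sales Revenue", "East", "2015-01"): -120000.0,  # credit
    ("Sales Revenue", "West", "2015-01"): -95000.0,   # credit
    ("Office Rent",   "East", "2015-01"): 8000.0,     # debit
}

def cell(account, division, month):
    """Return the single number at one intersection (0 if empty)."""
    return cube.get((account, division, month), 0.0)

def total(account, month):
    """Roll up one account across all divisions for a month."""
    return sum(v for (a, d, m), v in cube.items()
               if a == account and m == month)
```

A real product stores cubes server-side and adds many more dimensions, but the lookup-by-intersection idea is the same.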
You probably could find at least 50 OLAP products on the market. But most of them lack a key characteristic: spreadsheet functions.
Excel-friendly OLAP products offer a wide variety of spreadsheet functions that read data from cubes into Excel. Most such products also offer spreadsheet functions that can write to the OLAP database from Excel…with full security, of course.
Read-write security typically can be defined down to the cell level by user. Therefore, only certain analysts can write to a forecast cube. A department manager can read only the salaries of people who report to him. And the OLAP administrator must use a special password to update the General Ledger cube.
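A hedged sketch of how such cell-level rules might be modeled follows; the `RULES` table, the user names, and the `allowed` function are invented for illustration and do not reflect any particular product’s security model.

```python
# Hypothetical cell-level access control for OLAP cube regions.
# Each rule is a (cube_name, region) pair; "*" acts as a wildcard.
RULES = {
    "analyst_ann": {"write": [("Forecast", "*")], "read": [("*", "*")]},
    "mgr_bob":     {"write": [],                  "read": [("Salaries", "Sales Dept")]},
}

def _matches(rule, cube_name, region):
    r_cube, r_region = rule
    return r_cube in ("*", cube_name) and r_region in ("*", region)

def allowed(user, action, cube_name, region):
    """True if `user` may perform `action` ('read'/'write') on that region."""
    return any(_matches(rule, cube_name, region)
               for rule in RULES.get(user, {}).get(action, []))
```

Under this scheme, the analyst can write to the Forecast cube, while the manager can read only one region of the Salaries cube.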
Other OLAP products push data into Excel; Excel-friendly OLAP pulls data into Excel. To an Excel user, the difference between push and pull is significant.
Using the push technology, users typically must interact with their OLAP product’s user interface to choose data and then write it as a block of numbers to Excel. If a report relies on five different views of data, users must do this five times. Worse, the data typically isn’t written where it’s needed within the body of the report. Instead, it is merely parked in the spreadsheet for use somewhere else.
Using the pull technology, spreadsheet users can write formulas that pull the data from any number of cells in any number of cubes in the database. Even a single spreadsheet cell can contain a formula that pulls data from several cubes.
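The pull model can be sketched in Python like this; the cube contents and the `olap_get` function are hypothetical stand-ins for a product’s spreadsheet function, shown only to illustrate one cell’s formula drawing on several cubes.

```python
# Sketch of "pull": a spreadsheet cell holds a formula that fetches
# live values from server-side cubes at recalculation time.
cubes = {
    "Sales": {("Widgets", "Jan"): 1200.0},   # unit sales cube
    "FX":    {("USD->EUR", "Jan"): 0.9},     # exchange-rate cube
}

def olap_get(cube_name, *coords):
    """Pull one value from a named cube, like a spreadsheet function."""
    return cubes[cube_name][coords]

# A single cell's formula can combine values pulled from several cubes:
eur_sales_jan = olap_get("Sales", "Widgets", "Jan") * olap_get("FX", "USD->EUR", "Jan")
```

The key point is that the spreadsheet stores only the formula; the numbers stay on the server and are fetched fresh each time the sheet recalculates.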
At first reading, it’s easy to overlook the significant difference between this method of serving data to Excel and most others. Spreadsheets linked to Excel-friendly OLAP databases don’t contain data; they contain only formulas linked to data on the server. In contrast, most other technologies write blocks of data to Excel. It really doesn’t matter whether the data is imported as a text file, copied and pasted, generated by a PivotTable, or pushed to a spreadsheet by some other OLAP product. Those technologies turn Excel into a data store. Excel-friendly OLAP eliminates that problem by giving you real-time data for a successful BPM system.
To learn more about OLAP, click here.
“There’s nothing inherently wrong with spreadsheets; they’re excellent tools for many different jobs. But data visualization and data communication is not one of them.” – Bernard Marr
We couldn’t agree more with what Bernard says in his article, “Why You Must STOP Reporting Data in Excel!” Excel is everywhere, and it has proven to be a valuable resource for companies across the globe. The problem is that many companies use spreadsheets as their main line of internal communication. Excel is great at displaying all of the raw data you could possibly dream of; just ask any data analyst, who eats, sleeps, and dreams never-ending spreadsheets. Bernard gets right to the point and lays out the top four reasons that spreadsheets are not the right fit for visualizing and communicating data within an organization.
Most people don’t like them.
Bernard makes a great point: unless you work with Excel frequently, as a data analyst does, it has a reputation for being intimidating. Employees will be reluctant to use it, let alone analyze data in it. If employees are not working in Excel all day, they are most likely going to give it the cold shoulder when it comes to communicating data.
Important data is hidden.
It is safe to agree with Bernard on this: spreadsheets are not the best visualization tool out there. Most spreadsheets today are full of endless numbers. If users can’t look at the data and quickly separate the valuable from the non-valuable, that is a problem. There are better visualization tools that paint a clearer picture and allow for effective communication.
Loss of historical data.
Excel users constantly update facts and figures as necessary. The downside is that doing so essentially erases historical data. Without historical data there is no clear way to see trends and patterns, which takes away the ability to make predictions about the future.
It’s difficult to share.
Spreadsheets are not ideal for collaborative data sharing because they carry the risk of data being deleted or changed. Today, data is typically shared by emailing updated spreadsheets. That data is stale, or dead: it lacks the key quality of being “live,” in real time. This way of sharing is not only time consuming but also denies users the chance to collaborate on the most up-to-date information available.
The great news is, there’s an easy answer to all of the common frustrations of spreadsheets…
PowerOLAP is an example of a product developed to address all of these problems. It allows real-time collaboration between users while always remaining “live,” and it can store historical data, which allows accurate analytical predictions to be reported. Take a deeper look at PowerOLAP and see how it can take your organization to the next level.
To read the entire article by Bernard Marr, click here.
Human Resources is beginning to pull ahead in the Business Analytics world. One thing is certain: if you want to run a successful business, you need to hire the right people to help you get there. Who is in charge of recruiting and hiring your team members? “DING, DING, DING!” You guessed it: Human Resources. In a “Big Data” world, HR can use “People Data” to its advantage and help businesses develop strategy when it comes to hiring the best candidates. As David Klobucher writes in his article, “Data-driven confidence will help HR professionals identify behaviors and interview styles that attract better employees, as well as qualities that make effective workers – and lead to faster promotions.”
I agree with Klobucher: this is a great time to be in HR, and big opportunities may be presented to anyone working in the field. Executives are looking to their Human Resources departments to help build a strategy for success. Of course, this all depends on whether HR professionals welcome business analytics technology with open arms. As stated in the article, many individuals working in Human Resources are not completely comfortable using data just yet. Big Data surrounds all of us, but for HR it can lead to big success through analyzing the data of past successes and past failures.
For some HR departments, taking on this scope of technology could be intimidating. However, as one of my favorite sayings goes, “I never said it would be easy, I only said it would be worth it.”
Read original article here.
Originally posted on October 26, 2015.
Be Mindful to Keep these Practices throughout your Business Intelligence Project
Getting Business Intelligence best practices to work well is challenging. Check out the six best practices outlined in this infographic and see how your organization stacks up. It’s easy to think that bigger is better, or to assume you need to start from scratch. Actually, it’s about having BI policies in place, keeping everyone on the team involved, having a clear scope, and making sure everyone is well trained.
It is also important for a Business Intelligence project to have an analytics model that reflects the organization.
Who doesn’t want the largest return on their investment? The article written by Chris Twogood, How To Produce Ongoing Returns With Big Data, compares current Data and Analytics systems to what he claims is actually needed in today’s businesses.
He writes, and we agree, that data within companies only continues to grow in volume and variety. Data solutions are developed to handle one issue at a time, which causes data to remain separated and replicated over and over again. That is a prime example of a short-term, one-off solution, and it can’t be the best way, right?
Twogood explains that the solution is to implement a sustainable, central data and analytics model that ensures vital analyses can be calculated, because action and decision begin with the same question: What does the data tell us?
What stood out to me in this article, and truly symbolized what PARIS Tech, the sponsor of OLAP.com, is all about, was this: “To satisfy the individual requests of the organization, an analyst or end user should be able to access data at any time to apply the analytics available.”
PARIS Tech makes it possible for organizations to share an analytical data model in real time across different applications. Because, as Twogood says, users need to be able to access data at any time, even in real time. With this concept front and center, organizations can see:
- A reduction in time to meet new needs
- Less complexity and cost of infrastructure
- Higher productivity
Click here to read the full article.
Dennis McCafferty of CIO Insight recently wrote an article that addresses 11 of the top practices of Business Intelligence. With Business Intelligence driving such key functions in today’s companies as analytics, business performance management, text mining, and predictive analytics, it is crucially important to understand it. Let’s take a look at CIO Insight’s 11 best practices and see if you are already taking advantage of them.
- Bigger Isn’t Always Better: Just because a solution can gather a large amount of data doesn’t mean it is helping you get the most out of that data. McCafferty thinks that trustworthiness and immediacy are the key elements.
- Deliverable Value Over TCO: When your BI solution can deliver specific ROI, you will gain higher buy-in no matter the initial total cost of ownership.
- Take Stock of Current Resources: Leveraging the IT your company already owns to support your BI solution is a top practice. You can then put that spending toward something else that will make a larger impact.
- File-Formatting Resources: Since Business Intelligence uses more than 300 file formats, it is important that you are prepared and ready to use any one of them.
- Create BI Policies for Deployment: It is important to have BI policies in place such as how the data is collected, processed and stored. This will ensure higher level of relevance and accessibility.
- Go Team, Involve Business Leaders From the Outset: You need to remain on the same page as all of the different leaders and work as a big team to keep IT on the right path.
- The Only Constant? Change: Everything is constantly changing and evolving, so this will continue to test your BI deployment at all times.
- Limit Initial User Participation: It is better to start slow and steady when introducing initial users. If not, the rollout can lead to confusion and errors, which will diminish BI’s final impact.
- Define the Project’s Scope: A BI implementation should be taken in stages and a company must know how many users and functions will be needed over time.
- Training Day: In order for your BI project to be a success, you must take the right approach to training employees and make sure that they are properly educated and feel comfortable using the new solution.
- Support Self Service: The goal of BI is to pass along the project to the appropriate department. In order to do this you must support the training plans and keep this practice as a priority at all times.
Click here to read the original article.
Let’s take a look into some reasons that business intelligence projects tend to fail.
James Haight of Blue Hill Research recently wrote a blog post that breaks down the costs and numbers of data preparation. Typical reports focus on hours and efficiency. As stated by James Haight, “At Blue Hill, we recently published a benchmark report in which we made the case that dedicated data preparation solutions can produce significant efficiency gains over traditional methods such as custom scripts or Microsoft Excel.”
According to their study, data analysts spend on average about two hours per day just on data preparation. When Blue Hill Research factored in the average salary a data analyst makes in the U.S., around $62,000 per year, the rough calculations showed that the 28% of a data analyst’s time spent preparing data equals about $22,000 worth of their yearly salary. That number seems high considering just one analyst; imagine how drastic it looks when you factor in how many data analysts there can be at larger corporations. The post breaks down the numbers even further: a company with 50 data analysts, for example, spends an estimated $1,100,000 annually just on preparing data.
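The arithmetic behind those figures can be checked in a few lines of Python. The inputs are the post’s numbers; note that 28% of the $62,000 base salary is about $17,360, so the post’s rounded $22,000 per-analyst figure presumably counts costs beyond base salary.

```python
# Rough reproduction of the Blue Hill cost arithmetic quoted above.
annual_salary = 62_000        # average U.S. data analyst salary (per the study)
prep_share = 0.28             # portion of time spent preparing data

salary_spent_on_prep = annual_salary * prep_share  # ~ $17,360 of base salary

per_analyst_cost = 22_000     # the post's rounded per-analyst figure
analysts = 50
company_cost = per_analyst_cost * analysts         # annual cost at 50 analysts
```

Scaling the per-analyst figure by 50 analysts does reproduce the post’s $1.1 million company-wide estimate.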
In order for data analysts to shift their full attention to where it should be, analyzing the actual data, a solution needs to be implemented. PARIS Technologies has one: PowerOLAP, software designed to take the stress and time out of preparing and comparing data. It can aggregate information from any source into a multidimensional PowerOLAP database and gives users the flexibility to slice and dice with ease. Learn more about how PowerOLAP could be the solution for your company.
To read the Blue Hill Research post, click here.
Data planning is quickly becoming a top priority for businesses across the globe. Ben Rossi dives into the key components that are making it vital for organizations to manage their data. According to Rossi’s post, two main factors are at work. The first is the ever-increasing volume of data being pulled into organizations for analysis, a volume that only keeps accelerating. Large quantities of data are a great thing, but to retain any value from them, they must be managed the correct way.
The second is that organizations face tougher compliance policies, which require maintaining data for much longer periods. Not only are businesses overflowing with large quantities of data, they must now solve the issue of where all of it can be stored. Rossi gives the example of large credit card companies: in the past they were required to keep records of all credit card transactions for seven years, but there has been recent talk of extending that to ten or possibly more years.
Data planning can have a big positive impact on a company as a whole, but planning is essential to success. Proper planning ensures that things such as cloud storage and prioritized levels of data storage within one’s network are set up correctly. Planning the process and details of employee data access is crucial, too: figuring out the limits and accessibility of data for all employees early on ensures a positive workflow.
So, is it time to take a step back to re-evaluate just how effectively you are managing your data? What plan do you have in place and more importantly, has there been a positive impact on your business?
Want to read the entire article? Click Here.