OLAP and Excel

The Power of OLAP and Excel
Should Excel be a key component of your company’s Business Performance Management (BPM) system? There’s no doubt how most IT managers would answer this question. Name IT’s top ten requirements for a successful BPM system, and they’ll quickly explain how Excel violates dozens of them. Even the user community is concerned. Companies are larger and more complex now than in the past; they are too complex for Excel. Managers need information more quickly now; they can’t wait for another Excel report. Excel spreadsheets don’t scale well. They can’t be used by many different users. Excel reports have many errors. Excel security is a joke. Excel output is ugly. Excel consolidation occupies a large corner of Spreadsheet Hell. For these reasons, and many more, a growing number of companies of all sizes have concluded that it’s time to replace Excel. But before your company takes that leap of faith, perhaps you should take another look at Excel. Particularly when Excel can be enhanced by an Excel-friendly OLAP database.That technology eliminates the classic objections to using Excel for business performance management.

Introducing OLAP
Excel-friendly OLAP products cure many of the problems that both users and IT managers have with Excel. But before I explain why this is so, I should explain what OLAP is, and how it can be Excel-friendly. Although OLAP technology has been available for years, it’s still quite obscure. One reason is that “OLAP” is an acronym for four words that are remarkably devoid of meaning: On-Line Analytical Processing. OLAP databases are more easily understood when they’re compared with relational databases. Both “OLAP” and “relational” are names for a type of database technology. Oversimplified, relational databases contain lists of stuff; OLAP databases contain cubes of stuff.

For example, you could keep your accounting general ledger data in a simple cube with three dimensions: Account, Division, and Month. At the intersection of any particular account, division, and month you would find one number. By convention, a positive number would be a debit and a negative number would be a credit. Most cubes have more than three dimensions. And they typically contain a wide variety of business data, not merely General Ledger data. OLAP cubes also could contain monthly headcounts, currency exchange rates, daily sales detail, budgets, forecasts, hourly production data, the quarterly financials of your publicly traded competitors, and so on.
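
To make the structure concrete, here is a minimal sketch of such a cube in Python. The dimension members and amounts are invented purely for illustration and are not tied to any particular product.

```python
# A minimal sketch of a three-dimensional cube (Account, Division, Month).
# The dimension members and figures below are made up for illustration.
cube = {
    ("Sales Revenue", "North", "2024-01"): -120_000.0,  # credit (negative)
    ("Sales Revenue", "South", "2024-01"): -95_000.0,
    ("Travel Expense", "North", "2024-01"): 8_500.0,    # debit (positive)
}

# At the intersection of one account, division, and month sits one number.
value = cube[("Travel Expense", "North", "2024-01")]

# Rolling up a dimension is just summing across its members.
total_revenue_jan = sum(
    amount
    for (account, division, month), amount in cube.items()
    if account == "Sales Revenue" and month == "2024-01"
)
print(value, total_revenue_jan)
```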

You probably could find at least 50 OLAP products on the market. But most of them lack a key characteristic: spreadsheet functions.
Excel-friendly OLAP products offer a wide variety of spreadsheet functions that read data from cubes into Excel. Most such products also offer spreadsheet functions that can write to the OLAP database from Excel…with full security, of course.

Read-write security typically can be defined by user, down to the cell level. For example, only certain analysts can write to a forecast cube; a department manager can read only the salaries of the people who report to him; and the OLAP administrator must use a special password to update the General Ledger cube.
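
As a rough illustration of what per-user, cell-level rules like these might look like, here is a small Python sketch. The user roles, cube names, and rules are hypothetical; this is not any vendor's actual security model.

```python
# Hypothetical per-user access rules for cubes, sketched for illustration only.
def can_write(user: str, cube: str) -> bool:
    # Only forecast analysts may write to the Forecast cube;
    # only the administrator may update the General Ledger cube.
    rules = {
        "Forecast": {"forecast_analyst"},
        "General Ledger": {"olap_admin"},
    }
    return user in rules.get(cube, set())

def can_read(user: str, cube: str, employee_manager: str) -> bool:
    # A department manager may read only the salaries of direct reports.
    if cube == "Salaries":
        return user == employee_manager
    return True

print(can_write("forecast_analyst", "Forecast"))        # True
print(can_write("forecast_analyst", "General Ledger"))  # False
print(can_read("jane_doe", "Salaries", employee_manager="jane_doe"))  # True
```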

Other OLAP products push data into Excel; Excel-friendly OLAP pulls data into Excel. To an Excel user, the difference between push and pull is significant.

With push technology, users typically must interact with their OLAP product's user interface to choose data and then write it as a block of numbers to Excel. If a report relies on five different views of data, users must do this five times. Worse, the data typically isn't written where it's needed within the body of the report; it is merely parked in the spreadsheet for use somewhere else.

With pull technology, spreadsheet users write formulas that pull data from any number of cells in any number of cubes in the database. Even a single spreadsheet cell can contain a formula that pulls data from several cubes.

At first reading, it's easy to overlook the significant difference between this method of serving data to Excel and most others. Spreadsheets linked to Excel-friendly OLAP databases don't contain data; they contain only formulas linked to data on the server. In contrast, most other technologies write blocks of data to Excel. It really doesn't matter whether the data is imported as a text file, copied and pasted, generated by a PivotTable, or pushed to a spreadsheet by some other OLAP product; those technologies turn Excel into a data store. Excel-friendly OLAP eliminates that problem by keeping the data on the server and serving it to Excel in real time, which is exactly what a successful BPM system needs.
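
The difference is easy to demonstrate with a toy model. The cube contents and the cube_get helper below are invented for illustration and are not any vendor's actual API.

```python
# A toy illustration of "push" vs. "pull" semantics, not a real product API.
server_cube = {("Sales", "North", "2024-01"): 120_000.0}

def cube_get(account, division, month):
    """Stand-in for a spreadsheet function that reads one cube cell."""
    return server_cube[(account, division, month)]

# Push: a static number is parked in the workbook; it goes stale.
pushed_cell = server_cube[("Sales", "North", "2024-01")]

# Pull: the workbook stores only a formula; each recalculation
# re-reads the live value from the server.
pulled_cell = lambda: cube_get("Sales", "North", "2024-01")

server_cube[("Sales", "North", "2024-01")] = 135_000.0  # data changes upstream
print(pushed_cell)    # 120000.0  (stale copy)
print(pulled_cell())  # 135000.0  (current value)
```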

To learn more about OLAP, click here.

Excel Is Not a One-Stop Shop for Your Data Needs


“There’s nothing inherently wrong with spreadsheets; they’re excellent tools for many different jobs. But data visualization and data communication is not one of them.” – Bernard Marr

We couldn't agree more with what Bernard says in his article, "Why You Must STOP Reporting Data in Excel!" Excel is everywhere, and it has proven to be a valuable resource for companies across the globe. The problem is that many companies use spreadsheets as their main line of internal communication. Excel is great at displaying all of the raw data you could possibly dream of; just ask any data analyst, who eats, sleeps and dreams never-ending spreadsheets. Bernard gets right to the point and lays out the top four reasons that spreadsheets are not the right fit for visualizing and communicating data within an organization.

Most people don’t like them.

Bernard makes a great point: unless you work with Excel frequently, as a data analyst does, it has a reputation for being intimidating. Employees will be reluctant to use it, let alone to analyze data in it. If employees are not working in Excel all day, they are most likely going to give it the cold shoulder when it comes to communicating data.

Important data is hidden.

I think it is safe to agree with Bernard on this one. Spreadsheets are not the best visualization tool out there; most spreadsheets today are full of endless numbers. If users can't look at the data and quickly tell the valuable from the non-valuable, that is a problem. There are better visualization tools that paint a clearer picture and allow for effective communication.

Loss of historical data.

Excel users constantly update facts and figures as necessary. The downside is that doing so essentially erases all historical data. Without historical data there is no clear way to see trends and patterns, and no ability to make predictions for the future.

It’s difficult to share.

Spreadsheets are not ideal for collaborative data sharing because they carry the risk of data being deleted or changed. Today, data is typically shared by emailing updated spreadsheets. That data is stale, or dead: it lacks the key quality of remaining "live," in real time. This way of sharing is not only time consuming; it also eliminates the opportunity for users to collaborate while staying connected to the most up-to-date information available.

The great news is, there’s an easy answer to all of the common frustrations of spreadsheets…

PowerOLAP is one example of a product built to address all of these problems. It allows for real-time collaboration between users while the data always remains "live," and it stores historical data, which allows accurate analytical predictions to be reported. Take a deeper look into PowerOLAP and see how it can take your organization to the next level.

To read the entire article by Bernard Marr, click here.

 

28% of a Data Analyst’s Time is Spent on Data Prep


James Haight of Blue Hill Research recently wrote a blog post that breaks data preparation down into costs and hard numbers, where typical reports focus only on hours and efficiency. As Haight states, "At Blue Hill, we recently published a benchmark report in which we made the case that dedicated data preparation solutions can produce significant efficiency gains over traditional methods such as custom scripts or Microsoft Excel."

According to their study, data analysts spend on average about two hours per day on data preparation alone. Factoring in the average U.S. data analyst salary of around $62,000 per year, Blue Hill Research calculated that roughly 28% of a data analyst's time is spent preparing data, worth about $22,000 of annual compensation. That number seems high for just one analyst, and it grows dramatically once you factor in how many data analysts a larger corporation employs. The post breaks the numbers down further: a company with 50 data analysts, for example, is estimated to spend about $1,100,000 annually just on preparing data.
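
The arithmetic behind those headline figures is easy to reproduce. Here is a back-of-the-envelope sketch; the fully loaded annual cost used below is our own assumption (somewhat above the quoted $62,000 base salary), chosen so the published round numbers come out, and is not a figure taken from the Blue Hill report.

```python
# Back-of-the-envelope data-prep cost math; the inputs are illustrative assumptions.
prep_share = 0.28                   # share of working time spent on data prep
annual_cost_per_analyst = 78_000    # assumed fully loaded annual cost, USD
analysts = 50

cost_per_analyst = prep_share * annual_cost_per_analyst
total_cost = cost_per_analyst * analysts

print(f"Per analyst: ${cost_per_analyst:,.0f}")          # ~ $22,000
print(f"For {analysts} analysts: ${total_cost:,.0f}")    # ~ $1,100,000
```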

For data analysts to shift their full attention to where it should be, analyzing the actual data, a solution needs to be put in place. PARIS Technologies has that solution. PowerOLAP is software designed to take the stress and time out of preparing and comparing data. It can aggregate information from any source into a multidimensional PowerOLAP database, and it gives users the flexibility to slice and dice data with ease. Learn more about how PowerOLAP could be the solution for your company.

 

To read the Blue Hill Research post, click here.

Big Data & Using The Right Analytics For The Business Problem


Do you agree with the phrase, “less is more?”

We hear that phrase a lot, but what does it actually mean in the Business Intelligence/Big Data world? In the article Big Data Breakdown: Use the Right Analytics for the Business Problem, the author gives a great example of how that phrase holds true. Meta (great name!) S. Brown points out that in today's business world, many are wrapped up in the idea that the more data, the better. In actuality, the key to gaining the most return on your investment is to have just the "right" amount of data to solve your problem. Interestingly, Brown states that many businesses actually need only between 1% and 10% of the data they are currently collecting. Maybe businesses need to start taking a closer look at this question: can collecting too much data do more harm than good?

Check out the original article here.

Advice for CFOs: Invest in New Technology

“Top Technology Trends for Today’s CFOs” is another insightful post from a blogger we frequently feature, Timo Elliott. In it he admits that the CFO's relationship with the CEO and other business executives leaves something to be desired. He recommends that CFOs invest in the latest technology, which will increase productivity with real-time updates and continuous forecasting.

Image from Timo Elliott's post: http://timoelliott.com/blog/2015/07/top-technology-trends-for-todays-cfos.html

Elliott mentions a combination of new technologies, including in-memory computing, big data, the cloud, and mobile.

He homes in on a key point—that finance staff at large companies are extremely bogged down with just the basics of maintaining their financial reports. As Elliott puts it, “Staff have to spend too much time on basic duties and have no time to improve their understanding of the operational measures that drive and impact financial measures.” This lack of insight or understanding of how the operational measures drive and impact financial measures is the root of the relationship problem between CFOs and other business executives.

Elliott suggests new in-memory computing technologies because "they reduce complexity by combining real-time actuals with budgeting and analysis in a single, integrated system. Financial data is stored just once, making almost every aspect of financial operations faster, simpler, cheaper, and more effective." We couldn't agree more, as developers of a new in-memory technology ourselves.

The result of improved systems, improved speed, and better data is ultimately a better working relationship between business executives, and a more productive, effective workplace.

Read Timo Elliott’s post here.


Infographic: Why BI is Key for Competitive Advantage

A great infographic from the Master of Science in Computer Information Systems program at Boston University. The BU researchers focused on the growth of business intelligence, the management of data, decision-making, and budgeting. Enjoy!

[Infographic from Boston University: Business Intelligence Is Key]

Originally posted here.