Infographic: Business Analyst vs. Data Scientist

Business Analytics vs. Data Science

“Business analytics” and “data science”: are they basically interchangeable terms, or entirely separate professional pursuits? There is certainly overlap on the topic of Big Data and using data to inform decisions, and no one disputes that both business analysts and data scientists rely on exponentially growing sources of data to do their work. [Check out PARIS Tech’s recent post on Big Data]

An article and featured infographic by Angela Guess for Dataversity.net argue that the terms business analyst and data scientist are distinct, and not just because one pursuit applies to business and the other to scientific results.


Click below to read the original article that accompanies the business analyst vs. data scientist infographic.

Infographic: Business Analytics v. Data Science


Seeing Value in OLAP Anew: OLAP and Hadoop


OLAP and Hadoop: A Great Pairing

OLAP continues to be a relevant and exciting technology, most recently through its pairing with Hadoop. As OLAP.com, we have ALWAYS seen the value of OLAP technology. We admit OLAP has been a bit out of style in the last few years; some companies even run Google ads claiming that “OLAP is obsolete,” but nothing could be further from the truth. (Check out our blog on that one.)

We see this in the fashion industry all the time: what is old is new again! That is rare in the technology realm, but it seems to be the case with OLAP. As developers struggled to get value out of Hadoop data, they discovered they needed the speed and flexibility of OLAP. Together, OLAP and Hadoop are a powerful combination for reaching the ultimate goal of extracting value from Big Data.

Bringing OLAP to scale for Big Data

In a ZDNet article, “Is this the age of Big OLAP?”, Andrew Brust writes about the new relationship between OLAP and Hadoop. He highlights that OLAP technology can be particularly beneficial when working with extremely large Big Data sets. Typically, OLAP has not been scalable enough for Big Data solutions, but the technology continues to progress, and we find this new application of OLAP exciting. Brust discusses a few strategies for bringing the two technologies together, mentions a few OLAP vendors in detail, and explains how they manage the issue of scalability for OLAP software.

If you want to try using OLAP with Hadoop, why not give PowerOLAP, the mature OLAP product of OLAP.com, a try? A free version of PowerOLAP is available. If you plan to test PowerOLAP with your Hadoop data, contact PARIS Tech and they will lift the member limit in the free version, since you will need to go beyond the limit that ships with it.

In sum, OLAP.com is pleased to see OLAP rising in relevance once again and getting some of the recognition we felt it deserved all along. It is a testament to the power and value OLAP has as a technology.

OLAP and Excel



The Power of OLAP and Excel
Should Excel be a key component of your company’s Business Performance Management (BPM) system? There’s no doubt how most IT managers would answer that question. Name IT’s top ten requirements for a successful BPM system, and they’ll quickly explain how Excel violates most of them. Even the user community is concerned. Companies are larger and more complex than they used to be, too complex for Excel. Managers need information more quickly now; they can’t wait for another Excel report. Excel spreadsheets don’t scale well. They can’t be used by many different users. Excel reports are full of errors. Excel security is a joke. Excel output is ugly. Excel consolidation occupies a large corner of Spreadsheet Hell.

For these reasons, and many more, a growing number of companies of all sizes have concluded that it’s time to replace Excel. But before your company takes that leap of faith, perhaps you should take another look at Excel, particularly when it can be enhanced by an Excel-friendly OLAP database. That technology eliminates the classic objections to using Excel for business performance management.

Introducing OLAP
Excel-friendly OLAP products cure many of the problems that both users and IT managers have with Excel. But before I explain why this is so, I should explain what OLAP is, and how it can be Excel-friendly. Although OLAP technology has been available for years, it’s still quite obscure. One reason is that “OLAP” is an acronym for four words that are remarkably devoid of meaning: On-Line Analytical Processing. OLAP databases are more easily understood when they’re compared with relational databases. Both “OLAP” and “relational” are names for a type of database technology. Oversimplified, relational databases contain lists of stuff; OLAP databases contain cubes of stuff.

For example, you could keep your accounting general ledger data in a simple cube with three dimensions: Account, Division, and Month. At the intersection of any particular account, division, and month you would find one number. By convention, a positive number would be a debit and a negative number would be a credit. Most cubes have more than three dimensions. And they typically contain a wide variety of business data, not merely General Ledger data. OLAP cubes also could contain monthly headcounts, currency exchange rates, daily sales detail, budgets, forecasts, hourly production data, the quarterly financials of your publicly traded competitors, and so on.
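To make the cube idea concrete, here is a minimal Python sketch of a three-dimensional cube. The account, division, and month names are invented for illustration, and this is not how any particular OLAP product stores data internally; it simply shows that a cube maps one member from each dimension to a single value.

```python
# A tiny general-ledger cube with three dimensions: Account, Division, Month.
# Each cell is addressed by one member from each dimension and holds one number.
# Following the convention above, positive values are debits, negatives are credits.
gl_cube = {
    ("Travel Expense", "East Division", "2024-01"): 12500.00,   # debit
    ("Travel Expense", "West Division", "2024-01"): 9800.00,    # debit
    ("Sales Revenue",  "East Division", "2024-01"): -84200.00,  # credit
}

def cell(cube, account, division, month):
    """Return the single number stored at this intersection (0 if the cell is empty)."""
    return cube.get((account, division, month), 0.0)

print(cell(gl_cube, "Travel Expense", "East Division", "2024-01"))  # -> 12500.0
```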

You probably could find at least 50 OLAP products on the market. But most of them lack a key characteristic: spreadsheet functions.
Excel-friendly OLAP products offer a wide variety of spreadsheet functions that read data from cubes into Excel. Most such products also offer spreadsheet functions that can write to the OLAP database from Excel…with full security, of course.

Read-write security typically can be defined down to the cell level by user. Therefore, only certain analysts can write to a forecast cube. A department manager can read only the salaries of people who report to him. And the OLAP administrator must use a special password to update the General Ledger cube.
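As a rough sketch of how cell-level, per-user security might be expressed, consider the following. The rule structure, user names, and cube names are hypothetical and are not taken from any particular product's administration interface.

```python
# Hypothetical per-user access rules: which cubes each user may read or write.
# Excel-friendly OLAP products define rules like these administratively,
# often down to individual slices or cells.
ACCESS_RULES = {
    "forecast_analyst": {"read": {"Forecast", "GL"}, "write": {"Forecast"}},
    "dept_manager":     {"read": {"Salaries:OwnDepartment"}, "write": set()},
    "olap_admin":       {"read": {"GL", "Forecast"}, "write": {"GL"}},
}

def can_write(user: str, cube: str) -> bool:
    """True if this user is allowed to write values back to the given cube."""
    return cube in ACCESS_RULES.get(user, {}).get("write", set())

assert can_write("forecast_analyst", "Forecast")   # analysts update the forecast
assert not can_write("dept_manager", "Forecast")   # managers can read, not write
```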

Other OLAP products push data into Excel; Excel-friendly OLAP pulls data into Excel. To an Excel user, the difference between push and pull is significant.

With push technology, users typically must interact with their OLAP product’s user interface to choose data and then write it as a block of numbers to Excel. If a report relies on five different views of data, users must do this five times. Worse, the data typically isn’t written where it’s needed within the body of the report. Instead, the data is merely parked in the spreadsheet for use somewhere else.

Using the pull technology, spreadsheet users can write formulas that pull the data from any number of cells in any number of cubes in the database. Even a single spreadsheet cell can contain a formula that pulls data from several cubes.
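The following sketch illustrates the pull model using the same toy cube structure as the earlier example; olap_read is an invented stand-in for a product's spreadsheet formula, not a real PowerOLAP or Excel function.

```python
# Pull model: the report holds formulas, not copies of data. Each formula
# fetches the current value from the server-side cube when it recalculates.
cube = {
    ("Travel Expense", "East Division", "2024-01"): 12500.00,
    ("Travel Expense", "West Division", "2024-01"): 9800.00,
}

def olap_read(account, division, month):
    """Invented pull-style function, standing in for a spreadsheet formula."""
    return cube.get((account, division, month), 0.0)

# A single report cell can combine values from several intersections (or cubes):
variance = (olap_read("Travel Expense", "East Division", "2024-01")
            - olap_read("Travel Expense", "West Division", "2024-01"))
print(variance)  # recalculated from live cube data, not from a pasted block
```

With the push approach, by contrast, the equivalent numbers would be written into the sheet as static values and would go stale until the next export.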

At first reading, it’s easy to overlook the significant difference between this method of serving data to Excel and most others. Spreadsheets linked to Excel-friendly OLAP databases don’t contain data; they contain only formulas linked to data on the server. In contrast, most other technologies write blocks of data to Excel. It really doesn’t matter whether the data is imported as a text file, copied and pasted, generated by a PivotTable, or pushed to a spreadsheet by some other OLAP product. Those technologies turn Excel into a data store. Excel-friendly OLAP eliminates that problem by giving you real-time data for a successful BPM system.

To learn more about OLAP, click here.

Excel Is Not a One-Stop Shop for Your Data Needs


“There’s nothing inherently wrong with spreadsheets; they’re excellent tools for many different jobs. But data visualization and data communication is not one of them.” – Bernard Marr

We couldn’t agree more with what Bernard says in his article, “Why You Must STOP Reporting Data in Excel!” Excel is everywhere, and it has proven to be a valuable resource for companies across the globe. The problem is that many companies use spreadsheets as their main line of internal communication. Excel is great at displaying all of the raw data you could possibly dream of; just ask any data analyst who eats, sleeps, and dreams never-ending spreadsheets. Bernard gets right to the point and lays out the top four reasons that spreadsheets are not the right fit for visualizing and communicating data within an organization.

Most people don’t like them.

Bernard makes a great point: unless you work with Excel frequently, as a data analyst does, it has a reputation for being intimidating. Employees will be reluctant to use it, let alone analyze data in it. If employees are not working in Excel all day, they are most likely going to give it the cold shoulder when it comes to communicating data.

Important data is hidden.

It is safe to agree with Bernard on this one. Spreadsheets are not the best visualization tool out there; most spreadsheets today are full of endless numbers. If users can’t look at the data and quickly tell what is valuable from what is not, that is a problem. There are better visualization tools that paint a clearer picture and allow for effective communication.

Loss of historical data.

Users are constantly updating the facts and figures in Excel as needed. The downside is that updating in place essentially erases all historical data. Without historical data there is no clear way to see trends and patterns, and no basis for making predictions about the future.

It’s difficult to share.

Spreadsheets are not ideal for collaborative data sharing because they carry the risk of data being deleted or changed. Today, data is typically shared by emailing updated spreadsheets. That data is stale, or dead: it lacks the key quality of remaining “live,” or real-time. This way of sharing is not only time-consuming; it also prevents users from collaborating while staying connected to the most up-to-date information available.

The great news is, there’s an easy answer to all of the common frustrations of spreadsheets…

PowerOLAP is one example of a product designed to address all of these problems. It allows real-time collaboration between users while the data always remains “live,” and it stores historical data, which allows accurate analytical predictions to be reported. Take a deeper look at PowerOLAP and see how it can take your organization to the next level.

To read the entire article by Bernard Marr, click here.


How to Get the Largest Return from Your Big Data


Who doesn’t want the largest return on their investment? Chris Twogood’s article, “How To Produce Ongoing Returns With Big Data,” compares current data and analytics systems with what he claims is actually needed in today’s businesses.

He writes, and we agree, that data within companies only continues to grow in volume and variety. Data solutions are developed to handle one issue at a time, which leaves data separated and replicated over and over again. It’s a prime example of a short-term, one-off solution, and it can’t be the best way, right?

Twogood explains that the solution is to implement a sustainable, central data and analytics model that ensures vital analyses can be calculated, because action and decision begin with the same question: what does the data tell us?

What stood out to me in this article, and truly symbolized what PARIS Tech, the sponsor of OLAP.com, is all about, was this: “To satisfy the individual requests of the organization, an analyst or end user should be able to access data at any time to apply the analytics available.”

PARIS Tech makes it possible for organizations to share an analytical data model in real time across different applications, because, as Twogood says, users need to be able to access data at any time, even in real time. With this concept front and center, organizations can see:

  • A reduction in time to meet new needs
  • Less complexity and cost of infrastructure
  • Higher productivity

Click here to read the full article.

11 of the Best Practices for Business Intelligence


Dennis McCafferty of CIO Insight recently wrote an article that addresses 11 of the top practices of business intelligence. With business intelligence driving such key capabilities as analytics, business performance management, text mining, and predictive analytics, it is crucially important to understand it. Let’s take a look at CIO Insight’s 11 best practices and see whether you are already taking advantage of them.

  1. Bigger Isn’t Always Better: Just because a solution can gather a large amount of data doesn’t mean it is helping you get the most out of that data. McCafferty thinks that trustworthiness and immediacy are the key elements.
  2. Deliverable Value Over TCO: When your BI solution can deliver specific ROI, you will gain higher buy-in no matter the initial total cost of ownership.
  3. Take Stock of Current Resources: Leveraging the IT your company already owns to support your BI solution is a top practice. You can then direct that spending to something else that will make a larger impact.
  4. File-Formatting Resources: Since Business Intelligence uses more than 300 file formats, it is important to be prepared to work with any of them.
  5. Create BI Policies for Deployment: It is important to have BI policies in place for how data is collected, processed, and stored. This will ensure a higher level of relevance and accessibility.
  6. Go Team, Involve Business Leaders From the Outset: You need to stay on the same page as the different business leaders and work as one team to keep IT on the right path.
  7. The Only Constant? Change: Everything is constantly changing and evolving, so this will continue to test your BI deployment at all times.
  8. Limit Initial User Participation: It is better to start slow and steady when introducing initial users. Otherwise, confusion and errors can undermine BI’s ultimate impact.
  9. Define the Project’s Scope: A BI implementation should be taken in stages, and a company must know how many users and functions will be needed over time.
  10. Training Day: For your BI project to be a success, you must take the right approach to training employees and make sure they are properly educated and comfortable using the new solution.
  11. Support Self-Service: The goal of BI is to pass the project along to the appropriate department. To do this, you must support the training plans and keep this practice a priority at all times.


Click here to read the original article.