OracleBIBlog Search

Wednesday, October 21, 2009

Oracle OpenWorld, A Huge Success

Finally back from OpenWorld and a short vacation. OpenWorld this year was another great success for BI Consulting Group. Unfortunately I missed the Appreciation Event, but it sounds like the team had a great time! I take my hat off to the BICG marketing team for coordinating such a great event. We had a ton of visitors stop by our booth wanting to know more about our organization and how we can help them. Our Nerf footballs were a huge success; it was a good thing we received an additional order, as we went through almost 750 the first day!

The OPN Partner Forum on Sunday was very informative, and it was great to hear Charles Phillips speak. It's nice to hear how much Oracle values the relationships it has with its partners and just how crucial we are to their success. Specialization seems to be one of the key focus areas this year, and we are in a great position to serve it with our exclusive focus on Oracle BI and EPM.

While OpenWorld can be exhausting, it's one of the highlights of the year and I can't wait to do it all over again next year.

Reflections on a Different OpenWorld

Now that I've had a few days to get back into the groove (and catch up on emails), I find myself looking back on OpenWorld from a fifty-thousand-foot perspective. If you've ever been there, you know that OpenWorld is a lot like a wedding - it goes so fast, there's a lot of chaos, and it's over before you know it. If you're an Oracle partner, that analogy is even more apt given how much planning goes into being an exhibitor, speaker, and more (hats off to our Marketing group for another great year).


So as I look back on my OpenWorld experience, I can honestly say that this year felt very different from past years. True, we had a typhoon in the middle of the week, and you could tell that attendance was a little down, but it was more than that: Oracle people were talking the same language.

That may sound like a funny comment to make, but the reality is that Oracle has been on an acquisition spree for quite a while, especially around EPM, BI and Data Warehousing. At past OpenWorlds you could tell that, despite having successfully integrated the acquired companies into their P&L, they hadn't yet instilled in everyone both the big picture and all the little pictures. Oracle employees still spoke in their native tongues, so to speak.

But attending sessions in the EPM/BI track and talking to PMs, it's clear that Oracle has turned the corner in this regard. People are giving the same answers, and no tap-dance accompanies them. This is a good thing, because it means that the software is following suit.

Despite the repetitive revenue recognition slide (everyone chant: "sometime in the next 12 months"), Oracle was very forthcoming about what's coming and how it's all going to fit together - at a very detailed level. Not everything is known or shared at this point, but for the first time we got to see live examples of things like XML Publisher (BI Publisher) and OBIEE converging further, Interactive Reporting (IR)/SQR and XML Publisher converging, IR and OBIEE converging, Essbase and OBIEE converging, how to use and position ODI and GoldenGate, the convergence of OWB and ODI, why OBIEE and ODI are part of Middleware, and much, much more. None of this was major news - we've heard the roadmap presentation before - but you could actually see it coming together and, more importantly, everyone you talked to or heard from was saying the same things, giving the same answers, talking the same language.

I can recall hearing at OpenWorld last year that Oracle had more than 9,000 products in its software pool, and what a challenge it must be to bring synergy to even small subsets of them. EPM/BI has felt that way for the last 2-3 years, but 2009 marks a change. Looking into the future, we can finally start to see the beacon growing on the horizon. The vision is getting closer and closer to becoming reality. 2010 should be a fun year!

Thursday, October 15, 2009

Bubble Chart Tips

I will never forget, while working as a business analyst directly with senior leadership, being shown a funny-looking chart that took up half a page of the USA Today business section and being asked to replicate it for the business function I supported. As an analyst I already had the innate ability to look at charts, tables, and the like and decipher their meaning so the business could take action. This bubble chart stumped me. It took me an hour to figure out what the chart meant and then another day to replicate it in Excel, only to find out after bringing it to this senior leader that he wanted the measures on completely different axes than what was presented. This is when I fell in love with database analytics tools like OBIEE. OBIEE takes a simple approach to creating a bubble chart while giving you the flexibility to make it dynamic for your end user.

A typical line or bar chart has two axes: x (the bottom axis of the chart) and y (the left axis of the chart). The bubble chart has a third axis, the z-axis, which highlights the most meaningful piece of data that your viewer is visually searching for. Below are three examples using Sales Units, Sales Dollars and Price per Unit to view in a bubble chart within OBIEE.
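Outside of OBIEE, the same idea can be sketched in a few lines of Python with matplotlib. The regional figures below are made up purely for illustration and are not taken from the examples that follow; the point is simply that the third measure is carried by bubble size:

import matplotlib.pyplot as plt

# Hypothetical figures per region (illustrative only)
regions = ["Eastern", "Central", "Western"]
sales_units = [1200, 700, 650]          # x-axis
sales_dollars = [30000, 17800, 16100]   # y-axis
price_per_unit = [25.0, 25.4, 24.8]     # z-axis, rendered as bubble size

# Scale the z measure so the bubbles are large enough to compare visually
sizes = [p * 40 for p in price_per_unit]

fig, ax = plt.subplots()
ax.scatter(sales_units, sales_dollars, s=sizes, alpha=0.5)
for x, y, label in zip(sales_units, sales_dollars, regions):
    ax.annotate(label, (x, y))

ax.set_xlabel("Sales Units")
ax.set_ylabel("Sales Dollars")
ax.set_title("Bubble size = Price per Unit")
plt.show()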

Example 1: The following chart places Price per Unit on the z-axis. Notice how the bubbles are approximately the same size. This chart shows that each region is selling the product at around the same price per unit; however, the Eastern Region is selling at a higher volume and therefore has higher sales. The Eastern Region is noticeable as an outlier because of its placement within the chart. Configuring the bubble chart with little to no disparity in the z-axis value doesn't draw attention to the size of the bubbles, so visually this may not be a proper use of a bubble chart. A scatter chart may be better suited for this type of configuration.


Example 2: The following chart places Units on the z-axis. With Price per Unit on the y-axis you can see that the price per unit of the products has a small range (this is why the bubbles were close to the same size above). This particular bubble chart actually shows that the Eastern Region has a higher volume than the other regions, based on the size of its bubble.


But what if you wanted to view the chart using your company’s three key measures for the region in the format specific to your needs?

Example 3: By utilizing the column selector feature in OBIEE, you can create a dynamic view that lets you control the placement of your business's key metrics within the bubble chart. Remember to uncheck 'Automatically refresh when a new column is selected' so that the chart does not redraw while you are assigning measures to the proper axes.


Recap on Oracle OpenWorld




Today wraps up another Oracle OpenWorld. Outside of some extreme weather on Tuesday that had everyone scrambling for umbrellas, the event went off without a hitch. Many of the sessions provided real-world insight into customer deployments and technical details, but it was difficult to differentiate these from the sales & marketing presentations just by the session title and description.



Of greatest interest was finally seeing Oracle’s holistic vision for Integrated Analytics to serve ERP, CRM, EPM and Industry Application needs and understanding where the Analytical Foundation fits in the mix. Oracle is hitting the mark. As for the Customer Appreciation night, I’ll just say that Carnival Games are harder than they look and Steven Tyler can still rock the house!






Monday, October 12, 2009

Follow BICG On Twitter

BICG is now posting to Twitter... get real-time updates on any new blogs, whitepapers, videos, photos, and events.

http://twitter.com/BICG

Friday, October 9, 2009

BI Consulting Group is Ready for Oracle OpenWorld

Visit our booth at Moscone South, Booth 2201 - located FRONT ROW to the RIGHT!

Put the following BICG Events at OpenWorld on your calendar:

* Monday, October 12th, 10:45am - 11:15am
~ Spotlight on Oracle BI at PNC Bank
~ Moscone South, BICG Exhibit #2201

* Monday, October 12th, 12:45pm - 1:15pm
~ Introduction to Oracle Exadata
~ Moscone South, BICG Exhibit #2201

* Monday, October 12th, 3:45pm - 4:15pm
~ Introduction to BICG Identity
~ Moscone South, BICG Exhibit #2201

* Monday, October 12th, 7pm - 9pm
~ BICG Happy Hour featuring Insight Awards
~ Ponzu Lounge, 401 Taylor Street

* Tuesday, October 13th, 10:45am - 11:25am
~ Spotlight on Oracle BI at LCRA
~ Moscone South, BICG Exhibit #2201

* Tuesday, October 13th, 3:45pm - 4:15pm
~ Conducting Gap Analysis with BICG Impact
~ Moscone South, BICG Exhibit #2201

* Tuesday, October 13th, 5:30pm - 6:30pm
~ Procurement and Spend Analytics at Los Alamos National Laboratory
~ Marriott Hotel, Salon 12/13

* Wednesday, October 14th, 10:15am - 11:15am
~ Maximizing Oracle Business Intelligence Adoption: Amway, LCRA, and PNC Bank
~ Moscone South, Room 310


* Wednesday, October 14th, 12:45pm - 1:15pm
~ Migrate from Interactive Reporting/Brio to OBIEE the Smart Way
~ Moscone South, BICG Exhibit #2201

* Wednesday, October 14th, 2:45pm - 3:15pm
~ Spotlight on Oracle BI at Motorola
~ Moscone South, BICG Exhibit #2201

Additional information can be found at www.BIConsultingGroup.com/OOW.

Open Source Business Intelligence

We all love Open Source software, right? Apache, Java, Linux, Eclipse... What would our profession be without it?

But with the growing maturity of Open Source Business Intelligence solutions, perhaps the right question we should be asking is: What will our profession be WITH it?

Several factors are pointing to the potential near-term growth of Open Source BI solutions in the marketplace.

First is a wider acceptance of Open Source solutions in general among the business community. The Open Source Peace Prize for this effort should go to Linux. Every year more businesses are finding that Linux-based servers are proving their mettle in terms of stability and performance. Furthermore, as a legitimizing move for Open Source BI in particular, Gartner this year for the first time decided to invite two of the biggest names in Open Source BI - Jaspersoft and Pentaho, which Gartner considers "viable players in the BI platform market" - to participate in their Magic Quadrant study of BI vendors. And, judging from the fact that traditional BI vendor MicroStrategy is marketing a free limited-use version of its ReportingSuite software in obvious response to Open Source alternatives (try Googling "Open Source Business Intelligence" and note the resulting Sponsored Link), the Open Source "threat" has also clearly affected the marketing strategies of traditional vendors.

But before we go any further, let's be clear about the specific definition of Open Source. A common misperception outside of the developer community is that "Open Source" simply means "free." This is not necessarily the case. The more important distinction is that the underlying source code of the application is not only visible to the community but is also readily modifiable, so that members of the community can contribute their own changes in the form of improvements or bug fixes to that code. Hence "Open" (visible & modifiable) "Source" (source code). There are legal nuances to the different kinds of licensing reserved by authors of Open Source intellectual property but we can save that for a different discussion.

In the Linux case, while it is true that some variations of Linux are in fact "free," wise corporations that invest in Linux will almost always opt for the "Red Hat" flavor, which requires a paid subscription whose real value is access to tech support and upgrades - not to mention supportability by certain vendors of software hosted on Linux operating systems.

The Open Source paradigm - not just software but the concept of community-supported development - presents both threats and opportunities to the Oracle Business Intelligence community. I'd like to share some that have come to my mind.

Threats

* Slashed software budgets - With less software spending power, the lower acquisition cost of Open Source options will entice tech-savvy organizations, particularly SMBs, to evaluate Open Source solutions alongside traditionally licensed options. The sales strategy of traditional solutions will need to consider this challenger more seriously.

* Given the reality that any Open Source technology matures with every implementation, the result is a compounding effect - perhaps Open Source may not be a viable solution for certain sectors now, but in one year, who knows? Within two years we will almost certainly see greater acceptance.

Opportunities

* Slashed IT staffing budgets improve the attractiveness of pre-built and supported solutions like Oracle's BI Apps against Open Source options, which by their always-in-development nature will require more internal staff support.

* The demonstrated success of Open Source subscription-based business models together with wider acceptance of "Cloud computing" for the Enterprise portends the viability and popularity of BI solutions provided under a Software-as-a-Service (SaaS) model.

* Oracle's acquisition of Sun - With MySQL officially within its corporate portfolio, Oracle may need to deal with this hugely popular Open Source database one way or the other. Either Oracle will maintain a kind of status quo and frame MySQL as a fringe product best used for scrappy startups, or it will recognize MySQL's value as a kind of "gateway drug" that provides an easy path to Oracle's standard licensed database products. In that scenario we could see MySQL cultivated and eventually supported as a target warehouse provider.

* Open Source methodologies can be applied to the development of ancillary components for traditional BI solutions. In the case of Oracle BI, there's no reason why we can't host a library of Open-Source helper apps, like Java applets or JSP pages or JavaScript routines that can perform a variety of functions when invoked within the OBIEE platform, or even XML renderings of OBIEE reports themselves that demonstrate clever tricks of the technology.

This last opportunity excites me the most, as I believe it could serve to unify our community and provide a tangible means for us to build our credibility. The biggest barrier to this effort will be a natural bias against giving away valuable intellectual property, especially when that IP serves as a differentiator among BI consultancies. On the other hand, in an Open Source ecosystem, the contributions of a company or individual to the Oracle BI Open Source collective will serve to demonstrate and therefore legitimize their technical abilities far more powerfully than any marketing material, sales pitch or, dare I say, even a series of intelligent and insightful blog posts :-). As this effort is more focused on the community overall, it makes sense to see the community take on this initiative. (Hear that, UGOBI?)

In the spirit of "Open Sourcing" my thought processes, I remain eager to hear your feedback. What threats or opportunities do you see?

Here are some interesting articles on the topic for further reading:

Commercial Open Source BI Redux
http://www.dashboardinsight.com/articles/digital-dashboards/fundamentals/commercial-open-source-bi-redux.aspx
Penned by founders of OpenBI, this article reviews the current state and future opportunities of Open Source BI. A quote: "We wouldn't be surprised if Cloud vendors Amazon and Google started offering OSBI platform capabilities for free to customers contracting for their servers and storage."

Who Is Using Open Source Business Intelligence, and Why
http://www.itbusinessedge.com/cm/community/features/interviews/blog/who-is-using-open-source-business-intelligence-and-why/?cs=35889
Interview with Mark Madsen, founder and president of BI consultancy firm Third Nature and author of a recent study of open source adoption in the business intelligence and data warehousing markets.

Magic Quadrant for Business Intelligence Platforms
http://mediaproducts.gartner.com/reprints/sas/vol5/article8/article8.html
Gartner's opinion of the main software vendors that should be considered by organizations seeking to develop business intelligence (BI) applications. "This year Gartner gave serious consideration to including open-source BI suppliers in the Magic Quadrant, and even altered the inclusion criteria to allow for this eventuality. As yet, though, no open-source BI platform supplier generates enough revenue to be included in the Magic Quadrant [$20m in revenue]. However, while they don't meet the revenue requirement, Jaspersoft and Pentaho have emerged as viable players in the BI platform market and as such we invited these firms to take part in the Magic Quadrant user survey."

Sites

BeyeNetwork - Specifically the Open Source "Channel"
http://www.b-eye-network.com/channels/1405

Vendors

Jaspersoft
http://jaspersoft.com/

Pentaho
http://www.pentaho.com/

MicroStrategy
(again, not truly Open Source, but offers free limited licensing of proprietary ReportingSuite)
http://www.microstrategy.com/freereportingsoftware/index.asp?gclid=cnhim4p3q50cfsreagodzw3-ja

Data Quality

How often, after the implementation of a Business Intelligence (BI) project, have you heard that the business users do not feel the data is reliable, credible, and consistent enough to meet their analysis and reporting needs? Unfortunately, that is too often the response from a client after a BI project is implemented. As Ralph Kimball points out in his book "The Data Warehouse Toolkit", the business community must accept the data warehouse for it to be deemed successful. The other goals of the data warehouse are:

1. The data warehouse must make an organization’s information easily accessible
2. The data warehouse must present the organization’s information consistently
3. The data warehouse must be adaptive and resilient to change
4. The data warehouse must be a secure bastion that protects our information
5. The data warehouse must serve as the foundation for improved decision making

He further states that you can have the most technically sound data warehouse, but if the business community does not accept it as adding value, it is a failed program. One of the major reasons that a data warehouse is not accepted by the business community is the perception of poor quality in the data that the users access. There are many reasons why the data quality is perceived as poor, but one of the ways to discover the quality of the data is to conduct a data analysis in the early phase of the project - I discussed this briefly last week.

So where does data quality begin? Many point to the Database Administrator or the Information Technology staff as the cause of poor data quality. However, since the data is a corporate asset, the responsibility for data quality belongs to the whole organization. This is starting to be recognized by many corporations, as Master Data Management and Customer Data Integration processes have been started within some organizations. Some of the most prominent causes of poor data quality are:

1. Movement of centralized data systems to distributed data systems
2. Poor implementation of purchased package data systems
3. Silo implementation of purchased package data systems
4. Lack of data edits for inputting data into data systems
5. Lack of a system of record for corporate entities
6. Not viewing data entities from a corporate perspective

So what can be done in the short term to help corporations implement Business Intelligence until they can get Master Data Management and Data Quality processes implemented? Some of the short-term steps that can be taken:

1. Begin the data analysis process early in the program to help determine the quality and consistency of the data (see the profiling sketch after this list)
2. Work with the business users to find short term solutions to the data quality issues
3. Work with the business users to determine data edits for the data elements
4. Work with the business users to determine the definition and calculation of major data metrics
5. Work with the business users to determine a system of record for the major entities
6. Work with the business users to determine major data hierarchies
7. Work with the business users to determine an acceptable level of data quality for the project within the current BI Program
8. Work with the business users to develop a data repository for a corporate definition of data elements and metrics
9. Work with the business users to determine an acceptable level of analysis and reporting requirements
10. Keep the business users involved in all phases of the project development lifecycle
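To illustrate what the early data analysis in step 1 might look like in practice, here is a minimal profiling sketch in Python with pandas; the file name and column names are hypothetical placeholders, not part of any actual project:

import pandas as pd

# Hypothetical extract of a source table; the file and column names are illustrative only
df = pd.read_csv("customer_extract.csv")

profile = {
    "row_count": len(df),
    "null_counts": df.isnull().sum().to_dict(),                   # missing values per column
    "duplicate_keys": int(df["customer_id"].duplicated().sum()),  # key uniqueness check
    "distinct_regions": sorted(df["region"].dropna().unique().tolist()),  # domain check
}

# Simple range/edit check on a numeric column
bad_amounts = df[(df["order_amount"] < 0) | (df["order_amount"] > 1_000_000)]
profile["out_of_range_amounts"] = len(bad_amounts)

for key, value in profile.items():
    print(key, ":", value)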

Data quality is an issue facing all data warehouse and Business Intelligence programs. It should be addressed early in the BI Program and resolved in the short term with input from the business users. The business users are the ones who have to perceive the BI Program as adding value; otherwise it will not be successful. They have to be involved in all phases of the BI Program to understand the data issues and develop short-term solutions to the data quality and definition problems until a corporate Master Data Management and Data Quality Program can be implemented.

Sunday, October 4, 2009

Data Analysis/Validation

Data is the heart of all Business Intelligence projects, yet data analysis is often the most overlooked step in many Business Intelligence projects. Most often the data analysis/validation steps are performed in the testing phase, after the design and build are already completed. Often, when the new dashboards and reports are compared to the existing reports, there are differences between the values on the two. Hence, a long, detailed data analysis/validation process starts, which often causes the project timeline and budget to be exceeded.

A better approach is to start the data analysis/validation in the project requirements gathering phase. Once access is provided to the application database, it is very beneficial to write queries against the source system to validate the measures on the existing reports. By writing queries against the source system the data calculations can also be checked and validated. Many of the existing users' reports may have been built by a resource that is no longer with the organization. Additionally, by doing the data analysis/validation early in the project you can get a feel for the data quality. Small data quality errors in a transaction system will be magnified many times in the data warehouse. A small error on one record will be greatly magnified when you are looking at millions of records.
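A minimal sketch of that validation step, using Python with pandas and sqlite3 as a stand-in for whatever database drivers a project actually uses; the connection targets, table names and measure names are hypothetical:

import sqlite3  # stand-in for the actual source and warehouse database drivers

import pandas as pd

# Hypothetical connections; in practice these would point at the source
# application database and the data warehouse respectively.
source_conn = sqlite3.connect("source_app.db")
dw_conn = sqlite3.connect("warehouse.db")

# Aggregate the measure in the source system...
source_totals = pd.read_sql_query(
    "SELECT region, SUM(order_amount) AS total FROM orders GROUP BY region",
    source_conn,
)

# ...and the same measure as the warehouse presents it.
dw_totals = pd.read_sql_query(
    "SELECT region, SUM(sales_amount) AS total FROM fact_sales GROUP BY region",
    dw_conn,
)

# Compare side by side and flag any region where the values disagree.
compare = source_totals.merge(dw_totals, on="region", suffixes=("_source", "_dw"))
compare["difference"] = compare["total_source"] - compare["total_dw"]
print(compare[compare["difference"].abs() > 0.01])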

There is often an argument that there is no time in the project requirements phase to perform the data analysis/validation steps. However, if it is not performed in the early phases of the project, the data analysis/validation becomes far more expensive. There is an old project truism that puts this in perspective: if it is performed in the requirements phase it costs $1; in the design phase, $10; in the build phase, $100; in the testing phase, $500; and in production, $1,000. From this perspective it is more effective and cost efficient to perform the data analysis/validation in the early part of the project, where the cost and time impacts are smallest.