lunes, 14 de noviembre de 2016

Finally! SAP BW A Dynamic Star Schema!

For years, we (BW guys) have used the classical BW schema:

  • Multiple layers, and data is moved from layer to layer
  • If you need something new (add a field, etc.), usually you have to touch all layers, and most probably do a reload

This model has worked for years and has been very robust and performant. But it always lacked:

  • Flexibility and adaptability
  • Support of modern federated DWH architecture

Finally, with BW 7.5 this has changed: now we have the SAP BW Dynamic Star Schema, which separates star schema modeling from fact-table modeling.

This is the new proposal from SAP, which gives us exactly what was missing: flexibility and adaptability, and support for a modern federated DWH architecture.

SAP has published a document that I strongly recommend you check:

http://sapassets.edgesuite.net/sapcom/docs/2016/03/445ef806-647c-0010-82c7-eda71af511fa.pdf

I recommend focusing your attention on page 7 and onwards.

Enjoy Snowflaking!






martes, 23 de agosto de 2016

Shadow IT: The new competitor



"Shadow IT is the proliferation of information technology (IT) that remains largely unknown and unseen by a company’s business IT department, which is concerning from both a security and a management perspective."

"When employees bypass IT and use their own, unsanctioned technology, we call it “shadow IT.” The cloud makes this really easy: all you need is a browser or a mobile device and you can connect to a virtually endless number of cloud apps for collaboration, file sharing, note taking, presentation building and just about every other tool you could ever want. The cost is minimal or even free. Who could resist?"

These 2 quotes from David Metcalfe define what shadow IT is.

What does this mean for us, external consultants?

Usually we try to develop a relationship of trust with our customers' business areas and to keep a perpetual honeymoon with the customers' IT department. We have to follow their rules and use the platforms they chose, probably decades ago.

We deliver a product in X time, and most probably with a not-so-appealing UI, since we were using IT's approved platforms. Then shadow IT comes along and produces the same application in a third of the time, probably with a more appealing interface. Now we have a new competitor: shadow IT.

Gartner predicts that by 2016, 35 percent of IT expenditure will go toward shadow IT.

The case for shadow IT is that it drives innovation and removes boundaries. The dark side is that it is uncontrolled, unmanaged and potentially not secure enough.

One area where we, external consultants, can contribute to IT departments is the creation of change and configuration management for shadow IT applications. By defining simple rules and processes, we can help avoid detrimental outages or security breaches.

For example, if the person behind a shadow IT application decides to take a sabbatical year, we don't want the business to become unable to use its trendy forecast application.

Have you started a conversation about shadow IT with your customer?

viernes, 27 de mayo de 2016

Data visualisation tool + In memory Database in 10 minutes / Docker + Spark + Zeppelin

Hi,

Have you ever wanted an ultra-fast in-memory database and a real-time visualisation tool where you can create charts from your SQL query results?

Now you can! It can all be achieved using open source software and the effort of setting it up is minimal.

In a nutshell:

  1. Install Docker (https://www.docker.com)
  2. Download a Docker image that contains Spark + Zeppelin
  3. Execute the Zeppelin example

Install Docker
If you are in the IT world and you do not know what Docker is, I recommend you find out here.
The Docker installation process is very straightforward. If you are using a Mac or Windows 10, I recommend installing the beta version, which is much more efficient (on Windows, the beta requires Windows 10).

Download the Docker image
Once you have docker installed, download an image from: https://github.com/dylanmei/docker-zeppelin

As mentioned on the web page, you only need these two commands:

docker pull dylanmei/zeppelin
docker run --rm --name zeppelin -p 8080:8080 dylanmei/zeppelin

The first command can take from three to ?? minutes depending on your internet connection speed. In my case, it took four minutes.

Execute the Zeppelin example
After executing the second docker command which runs the image, you will see something like this:


This means that Zeppelin  (the visualisation tool) and Spark (the in-memory database) are ready.

Open your browser to this address: http://127.0.0.1:8080. Use Chrome or Firefox.

And you should see something like this:


Click on the Zeppelin Tutorial note. 


Click Save; this binds the notebook to the interpreters (dependencies) it needs.

Then click the "play" icon for the first part or paragraph as it is called in Zeppelin.


As you can see, this will create a table called "bank" (1) from a text file located in (2).
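If you are curious what that paragraph does under the hood, here is a rough pyspark sketch of the same mechanism: load a delimited text file and register it as a temp table. The file path, delimiter and column list below are placeholders of mine, not the tutorial's exact ones; the tutorial's own paragraph does this for the real bank dataset.

%pyspark
# Rough sketch of the load step: read a delimited text file and register it as a
# temp table so SQL paragraphs can query it. Path, delimiter and columns are
# placeholders, not the tutorial's exact ones.
from pyspark.sql import Row

raw = sc.textFile("/tmp/bank.csv")               # placeholder path inside the container
header = raw.first()
rows = (raw.filter(lambda line: line != header)  # drop the header line
           .map(lambda line: [v.strip('"') for v in line.split(";")])
           .map(lambda f: Row(age=int(f[0]), job=f[1])))

bank = sqlContext.createDataFrame(rows)          # sqlContext is predefined by Zeppelin
bank.registerTempTable("bank")                   # now "bank" is visible to %sql paragraphs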

Now you can start to explore the data:


Feel free to execute any of the charts, or alter the SQL statements to start exploring the data you just loaded.
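For instance, a paragraph like the one below runs a simple aggregation over the new table. This is only a sketch: it assumes the "bank" table has been registered by the tutorial's first paragraph and that the image's pyspark interpreter is available. The same SELECT also works in a %sql paragraph, where Zeppelin turns the result into a chart.

%pyspark
# Count people per age among customers younger than 30 (assumes the "bank"
# temp table created by the tutorial's first paragraph).
df = sqlContext.sql("""
    SELECT age, COUNT(1) AS people
    FROM bank
    WHERE age < 30
    GROUP BY age
    ORDER BY age
""")
df.show()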

In my next post I will write about using more advanced features.


viernes, 29 de enero de 2016

From IoT through the cloud into analytics

I still remember the example from several years back telling us what IoT was going to be about: "Your fridge will be connected to the internet and will order milk and eggs when you are running out." It sounded kind of appealing, but I think IoT is much more interesting than that.

One nice example of what IoT can achieve is described in this Wired magazine article.

I wanted to experience IoT, cloud, and analytics combined, with a budget of 50€, over 5 days, at a rate of 1.5 hours per day.

The result of my experiment is this:



I got an Onion Omega microcomputer (30 USD) that runs Linux, can be connected to any type of sensor, and connects to a Wi-Fi network. I originally wanted to use a Raspberry Pi Zero (5 to 10 USD), but they were on back order.

In this case I decided to work with a temperature sensor (5 USD). So, inside the Onion I put a Python program (thanks Maria!) that reads the temperature from the sensor and sends it to the Microsoft Azure cloud.
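The original script isn't included in this post, but a minimal sketch of the idea looks roughly like the following. Everything specific here is an assumption of mine, not a detail from the actual program: the read_temperature() helper, the placeholder endpoint URL, and the use of the requests library on the Omega.

# Minimal sketch, not the actual program: read a temperature value and POST it
# to a cloud endpoint once per minute.
# Assumptions: a read_temperature() helper that parses the sensor's output,
# a placeholder HTTPS endpoint, and the requests library installed on the Omega.
import time
import requests

ENDPOINT = "https://my-telemetry.example.com/readings"  # placeholder, not a real Azure URL

def read_temperature():
    # Placeholder: assumes another process writes the latest reading to this file.
    with open("/tmp/temperature", "r") as f:
        return float(f.read().strip())

while True:
    payload = {"device": "onion-omega-01", "celsius": read_temperature(), "ts": int(time.time())}
    requests.post(ENDPOINT, json=payload, timeout=10)    # send one reading to the cloud
    time.sleep(60)                                       # one reading per minute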

Then I used Microsoft's newest analytics service, Power BI, to create some charts from the resulting data.

In a nutshell, any small and cheap microcomputer connected to an equally small and cheap sensor and to a Wi-Fi network can send its readings to a server in the cloud. Then, with any device (PC, smartphone, tablet) connected to the Internet, you can analyse and visualise this information. Great, isn't it?!

Right now I'm in the process of creating a platform that enables anybody to collect and analyse information from sensors (whatever sensor you can imagine) in a cheap and fast way. So, if you have an idea of where you could use this, please contact me.

Some pictures:


1. The Onion with a USB stick (A) and the temperature sensor (B). The Onion is on top of a pack of cigarettes so you can gauge its size.




2. A simple dashboard (running on my iPad) that shows the temperature of the last 30 mins, the current temperature, and the average of the last 5 days. I created a couple of artificial temperature variations by putting some ice near the sensor.