First of all, we are very proud that our “T360 solution” offers functionality that is fully in line with the expectations and trends for 2017! This was recently validated twice: first in Forbes, and then by Alexandra Levit, currently a writer for The New York Times, who cited dFakto’s achievements in VR and 3D printing on QuickBase as the number one project management trend for 2017.

3D Printing and Virtual Reality for project management?

Many projects today take place in the cloud and via distributed teams, which can make it difficult for team members to feel fully immersed in their work experiences. Virtual reality and 3D printing technology reinvigorate the project lifecycle so that tasks and collaboration efforts resonate more strongly.

As our CEO Thibaut de Vylder stated in the interview for the Forbes article: “We created a UX experience in project management that allows people to update their project status in 30 seconds or less each month just by focusing on what’s most important.”

The virtual reality experience provides full immersion through an app that lets project managers see progress charts in a personal theatre setting. Focus your eyes on a particular chart, and it grows; you can drill into the detail the same way. The 3D printing component similarly engages the sense of touch: it creates something you can put on your desk, unlike a digital report that you can no longer see once you close it.

What we believe

We firmly believe that these technologies close the information gap by delivering information and insights in the way people expect them. Not only do future data-driven platforms need to be open, modular and techno-agnostic, they also need to deliver value to their stakeholders.

We strongly believe that the following principles are the key to delivering successful results:

  • Open: to any useful technology that provides value.
  • Modular: each technology must integrate smoothly and be replaceable on demand.
  • Techno-agnostic: no religion, just cost-effectiveness!

Take a look here if you want to know more about T360.


As you will surely know, the overall objective of the BCBS239 regulation is to strengthen banks’ risk data aggregation capabilities and risk reporting practices as they relate to credit risk, operational risk, market risk and liquidity risk. The regulation is mandatory for ‘global systemically important banks’ – known as G-SIBs – such as BNP Paribas, ING and Belfius (in Belgium). A full list of these banks is available here. It is, however, also strongly recommended that national regulators apply the same principles to domestic systemically important banks (D-SIBs), three years after their identification as such.

The new regulation insists that G-SIBs should be able to produce their aggregated risk numbers ‘within a short time, like in a crisis mode’. The overall aim is that a better understanding of their risks will help improve decision-making processes at banks. Remember 2008? Well, the financial authorities still remember it too… BCBS239 is a step towards ensuring that history doesn’t repeat itself.

What is the BCBS239 regulation?

Drafted in reaction to the 2008 crisis, the BCBS239 regulation outlines 14 principles under four main chapter headings: I. Overarching governance and IT infrastructure; II. Risk data aggregation capabilities; III. Risk reporting practices; and IV. Supervisory review, tools and cooperation. Even though it is a principles-based regulation, with few clear and defined metrics that can be used to monitor compliance, it is designed to improve the reporting and supervision of risk within large banks.

Do you have time to wait for someone to give you one single version of the truth?

In the long term, the solution seems simple: streamlined data aggregation as an input for all BCBS239 requirements. However, this is easier said than done, as many banks have legacy IT systems that are not adapted to the tougher regulatory reporting demands. Moreover, the data needed to feed the reports is not available from a single source but is historically spread throughout the organisation.

Making faster and more accurate reporting a reality is one reason large banks have appointed a Chief Data Officer. The CDO needs to make sure that the risk numbers are regularly available, accurate and verified. Legacy IT systems, data spread over a multitude of sources, and additional pressure on the operational and commercial use of data only add to the difficulties facing the CDO.

We need those numbers… now

Let’s imagine that you have all the data available, and that it has been aggregated and verified. In itself, that already seems like a great achievement. But now imagine you need to produce the necessary dashboards and insights on the spot. The dream seems a bit further away now, doesn’t it? The reality is that many G-SIBs cannot currently deliver accurate numbers within a limited time frame, and it would be even harder in a difficult ‘crisis’ period. Building that capability inevitably takes time to develop and even longer to set up and put in place. Can the G-SIBs afford to wait for their IT departments to deliver risk aggregation on their operations?

If you think the answer is yes… then that’s fine: you don’t need to read any further. But if you think the answer is no… then dFakto has a cost-effective solution for you that can be worked out and delivered within a short time frame.

dFakto is expert in managing data and aggregating results. It has a long track record of sourcing and storing confidential data from IT systems, verifying it, enhancing it where necessary and then presenting it in a way that is easy for everyone to understand. Its data factory and data vaulting techniques allow for real-time dashboards, meaning that if you have a risk aggregation question tomorrow, you can also answer it tomorrow.

If you are looking for a real answer to this problem within a short time frame, don’t just put more consultants on the job: hire people who can deliver an effective solution combining business intelligence and technical know-how.

Ask dFakto to explain how its expertise and experience could be put to good use to help you solve a pressing regulatory issue within just a few months.


The goal is to turn data into information, and information into insight.
Carly Fiorina, Former CEO of HP


Many different factors contribute to dFakto’s ability to deliver insights and results quickly and accurately. The main one is its proprietary “DataFactory”, which uses a “Data Vaulting” process to input data.

Here is what dFakto has put in place:

dFakto’s understanding of the problem

When dFakto sets out to solve a client’s data analysis problem, it starts from the business problem, not from the amount of data available within the company.

First, dFakto’s business analysts identify the precise answers needed to take a decision.

Second, they look for the data that can provide the required insights.

Consequently, they only source the data needed to solve the problem at hand.

dFakto’s systematic methodology

The “DataFactory” is designed around a model of how the client does business. In this way, it mirrors the critical information needed to solve the client’s problem. Furthermore, the incoming data is broken down into its most elemental parts and then archived.

This means that if new data fields or new sources are added later, there is no need to reconfigure the database architecture.
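As a purely illustrative sketch of how such a layout can work (the table and column names below are our own assumptions, not dFakto’s actual schema), a Data Vault-style design keeps business keys in “hub” tables and the descriptive attributes from each source in separate “satellite” tables, so adding a new source means adding a new satellite rather than altering anything that already exists:

```python
# Minimal, illustrative Data Vault-style layout (assumed names, not dFakto's).
# Business keys live in "hub" tables; each source's attributes live in a
# "satellite" table keyed by the hub and a load date, so history accumulates.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Hub: one row per business key (e.g. a project), plus load metadata.
CREATE TABLE hub_project (
    project_key   TEXT PRIMARY KEY,
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL
);

-- Satellite: descriptive attributes from one source, historised by load date.
CREATE TABLE sat_project_status (
    project_key   TEXT NOT NULL REFERENCES hub_project(project_key),
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL,
    updated_by    TEXT,
    status        TEXT,
    budget_spent  REAL,
    PRIMARY KEY (project_key, load_date)
);

-- A new data source later on? Add another satellite; nothing above changes.
CREATE TABLE sat_project_risk (
    project_key   TEXT NOT NULL REFERENCES hub_project(project_key),
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL,
    risk_rating   TEXT,
    PRIMARY KEY (project_key, load_date)
);
""")
```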

dFakto’s DataFactory

dFakto builds a “DataFactory” from the data it receives from the client, using a “Data Vaulting” process to input it. Everything is stored, regardless of the system it comes from. This rigorous and systematic way of storing data preserves the history of every change, so an auditor can trace values back to their original source and the project manager can see who updated a particular data field, and when.

The objective is easy traceability of every change, which in turn creates insights for the client.
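Continuing the illustrative sketch above (the function and fields are assumptions, not dFakto’s actual implementation), each incoming value could be appended together with a load timestamp, its record source and the user who submitted it, so nothing is ever overwritten:

```python
# Continuation of the illustrative sketch above, not dFakto's actual code:
# every raw value is appended together with its audit metadata.
from datetime import datetime, timezone

def load_status(conn, project_key, status, budget_spent, record_source, updated_by):
    """Store one raw status snapshot; earlier snapshots are never overwritten."""
    now = datetime.now(timezone.utc).isoformat()
    # Register the business key if it is not known yet.
    conn.execute("INSERT OR IGNORE INTO hub_project VALUES (?, ?, ?)",
                 (project_key, now, record_source))
    # Append the new snapshot with who loaded it, from where, and when.
    conn.execute("INSERT INTO sat_project_status VALUES (?, ?, ?, ?, ?, ?)",
                 (project_key, now, record_source, updated_by, status, budget_spent))

# An auditor can later trace any value back to its source and author:
#   SELECT status, load_date, record_source, updated_by
#     FROM sat_project_status WHERE project_key = ? ORDER BY load_date;
```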

The fundamental principle of the “DataFactory” is that, at this stage, no distinction is made between good and bad data. Data quality is only considered and worked on after the data has first been stored.

There is only “a single version of the facts”.

dFakto’s insights

The advantages of this type of system are multiple. The input data is always raw data, with no prior manipulation. In this way, dFakto can track exactly:

  • The origin of the data;
  • The accuracy of the data;
  • The person responsible for the data; and
  • The location of the data.

Better still, it doesn’t matter how complex the model becomes or how many new data sources are added.

In this way, there is never any need to go back to the beginning and start the whole process again. It means that if another problem arises and the same data is required, then everybody is able to access the same ‘input’ at any time.
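Under the same illustrative assumptions as the sketches above, answering a new question simply means querying the stored history again; for example, the latest known status per project can be read directly from the vault:

```python
# Same illustrative sketch: a new question is answered from the stored history,
# without re-extracting anything from the source systems.
latest = conn.execute("""
    SELECT s.project_key, s.status, s.budget_spent, s.updated_by, s.load_date
    FROM sat_project_status AS s
    JOIN (SELECT project_key, MAX(load_date) AS max_load
          FROM sat_project_status
          GROUP BY project_key) AS m
      ON s.project_key = m.project_key AND s.load_date = m.max_load
""").fetchall()
for row in latest:
    print(row)
```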

In conclusion, dFakto doesn’t necessarily do “big data”, though it can and does. Instead, it works with “the right data”. The company uses its business analysis expertise and experience to unpack a client’s problem and see precisely what information is needed to answer a specific question. This reduces the time spent collecting and checking data and increases the time spent understanding and interpreting what the results mean. Better still, clients enjoy peace of mind, knowing that the answers and insights generated are based on the most recent available data.


Find out more from our friendly team of business and technical experts at info@dfakto.com or +32(0)2.290.63.90.