January 09, 2023
Gathering data for future use is on almost everyone’s agenda; however, processing this large volume of data to produce meaningful information and reports should always be the primary objective. Needless to say, with the revolution in technology, data contributors are multiplying rapidly. Be it the smartphone in your pocket, Google on your glasses, or Apple on your wrist, data has been coming from all directions, and now of course the Internet of Things (IoT) is another, even larger and more generic source of data. Naturally, the question that reverberates in almost every board room and war room today is, “How do we make the best use of this data?”
At Equations Work we have been quite instrumental in providing solutions to such problems. What really triggered me to write this post is the feedback we recently received from one of our leading investment banking clients:
“We are now able to make meaningful business decisions with the information that has been with us for a decade. Your recommendation of technology and consultancy has helped us analyze piles of data more effectively and efficiently.”
Equations Work has always believed in implementing a 3P strategy when it comes to data: Performance, Precision, and Presentation. These 3P’s are as follows.
The first of these is performance. Timeouts are never healthy for your users, especially in real-time applications. Making relevant information available faster is the key to success, and for this, addressing only the relevant sections of data by filtering out noise is very important. Using the right tools and technologies in the correct combination can also reduce development time.
For example, many a time we have seen MongoDB used with MapReduce simply to pile up data and then call it big data. The price of using MapReduce is speed: the group command is not particularly fast, but MapReduce is slower still and is not meant to be used in real time. You run MapReduce as a background job, it writes a collection of results, and then you query that collection in real time. A sketch of this pattern follows.
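To make the pattern concrete, here is a minimal Python sketch (not the client’s actual code) using pymongo: a background job aggregates raw documents into a separate summary collection, and the real-time path only does a cheap lookup against that collection. The database, collection, and field names here are hypothetical.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["marketdata"]

def refresh_summary():
    """Background job: aggregate raw trades into a per-symbol summary."""
    db.trades.aggregate([
        {"$group": {
            "_id": "$symbol",
            "avg_price": {"$avg": "$price"},
            "total_volume": {"$sum": "$volume"},
        }},
        # $merge writes the results into a separate collection,
        # replacing the previous summary for each symbol.
        {"$merge": {"into": "trade_summary", "whenMatched": "replace"}},
    ])

def get_summary(symbol):
    """Real-time path: a cheap indexed lookup, no heavy computation."""
    return db.trade_summary.find_one({"_id": symbol})
```

The expensive aggregation runs on a schedule, while user-facing requests only ever hit the precomputed collection, which is what keeps response times predictable.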
Precision in analysis is key to successful business decisions. Producing accurate results is typically tricky for large volumes of data because you need to roll up, drill down, perform multilevel aggregations and collations, and apply OLTP and OLAP best practices to produce a meaningful summary. Across many engagements our focus has always been on achieving maximum precision while producing meaningful information from a mashup of technologies.
For example, one key challenge we had the pleasure of overcoming involved computing stock beta and standard deviation from a large volume of stock-related data from Bloomberg, where the numbers mattered down to the decimal point for risk managers. A minimal sketch of that calculation is shown below. By vastly increasing the amount of data we use, we can incorporate lower-quality sources and still remain remarkably accurate. What’s more, because we can continue to re-evaluate, we can correct errors in initial assessments and make adjustments as facts on the ground change, thus increasing precision.
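As a simple illustration of the statistics involved (not the production implementation), the following Python sketch computes beta and standard deviation from daily prices with pandas; the DataFrame and column names are assumptions for the example.

```python
import pandas as pd

def beta_and_stdev(prices: pd.DataFrame, stock: str = "STOCK", index: str = "INDEX"):
    # Convert closing prices to daily returns.
    returns = prices[[stock, index]].pct_change().dropna()
    # Beta = covariance(stock, market) / variance(market).
    beta = returns[stock].cov(returns[index]) / returns[index].var()
    # Standard deviation (volatility) of the stock's daily returns.
    stdev = returns[stock].std()
    return beta, stdev
```

In practice the inputs would come from the Bloomberg historical feeds mentioned above, and the same formula would be evaluated over multiple time windows.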
One thing users dislike is a user interface that looks like an airplane dashboard, with too much information on the same page. At Equations Work we believe that “if content is the king, then presentation is the queen.” Data presentation is of utmost importance; a few of the techniques that Equations Work follows and strongly recommends are:
- Producing information relevant to the user’s role and presenting it in a neat manner. One way to achieve this is to build custom, configurable views on top of a customized data model; a combination of MVC, jQuery, and Entity Framework is usually very helpful for implementing such solutions (see the sketch after this list).
- Smart, dynamic user interfaces, for example gadgets that can be moved across pages and dashboards with graphical elements linked to detail views, reducing the amount of information spread across a reporting page.
- Mashing up the GUI using rich user-control suites such as Infragistics and Telerik.
- Leveraging reporting and analytics tools such as Tableau and Cognos, which add value for interactive reporting and can be integrated into applications.
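The actual solutions referred to in the first point were built with MVC, jQuery, and Entity Framework, but the underlying idea of role-driven, configurable views is language-agnostic. Here is a minimal Python sketch, with hypothetical roles and field names, of how a view configuration can project the same data-model record differently per role.

```python
# Each role sees only the fields configured for it; roles and fields
# below are illustrative, not taken from the client's system.
VIEW_CONFIG = {
    "risk_manager": ["symbol", "beta", "std_dev", "exposure"],
    "trader": ["symbol", "last_price", "volume"],
}

def build_view(record: dict, role: str) -> dict:
    """Project a full data-model record down to the fields for a role."""
    fields = VIEW_CONFIG.get(role, [])
    return {field: record[field] for field in fields if field in record}

record = {"symbol": "ABC", "beta": 1.12, "std_dev": 0.034,
          "exposure": 250000, "last_price": 101.5, "volume": 12000}
print(build_view(record, "risk_manager"))
```

Because the configuration lives in data rather than code, new roles or report layouts can be added without touching the rendering logic, which is the same benefit the MVC/Entity Framework approach provides.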
“Equations Work got the chance to work with a large investment bank on producing statistical information on stocks and making it available for analysis. The information centred on the beta and standard deviation of each stock for risk calculations, where data accuracy and availability were of the utmost importance. The input came from Bloomberg as historical stock price and volatility data for different time periods, amounting to a very large footprint of 40+ GB on disk.
With the combination of approaches discussed above, a state-of-the-art solution was delivered, making the right, accurate information available just a click away!”