Big Data - Many Talk About It, We Do It!
We’ve been working on big data topics in our labs and with our clients for quite a while now. Over time, we have assembled a framework of technologies and utilities on which we build data-driven projects. We call it ti&m analytics.
ti&m analytics is based on the Lambda Architecture, an industry best practice built entirely on open-source technologies. Data-driven or big data ventures have requirements in two categories:
- Real-time: monitoring, recommendations, fraud detection
- Historic: analysis of classifications, statistics, exit points and so on
In order to fulfill those requirements, Lambda uses the following four main building blocks:
- Data integration brings together data from all sources. These can be push sources such as REST services or sensors, as well as file loads and other periodic pull loads.
Technologies: Kafka, Storm
- Real-time / speed layer is mainly used for monitoring and alerting use cases; it only processes incoming events and cannot recompute results over historical data.
Technologies: Spark Streaming, atmosphere.io
- Batch / serving layer is used for calculations on historic data and for storing it. Technologies: HDFS, Spark, HBase
- Visualization / data access: results are displayed either in dashboards or in interactive tools such as Tableau, or queried using Hive.
Technologies: ti&m dashboard, Tableau, Hive
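The core idea behind these building blocks can be sketched in a few lines: a query is answered by merging a precomputed batch view with a small real-time delta maintained by the speed layer. The following is a minimal, hypothetical sketch in plain Python; in the actual stack the batch view would be computed by Spark on HDFS and served from HBase, and the real-time view maintained by Spark Streaming.

```python
from collections import defaultdict

# Hypothetical batch view: page-view counts precomputed from historical data
# (in the real stack: computed by Spark on HDFS, served from HBase).
batch_view = {"home": 10_000, "checkout": 2_500}

# Hypothetical speed-layer view: counts for events that arrived after the
# last batch run (in the real stack: maintained by Spark Streaming).
realtime_view = defaultdict(int, {"home": 42, "signup": 7})

def query(page: str) -> int:
    """Answer a query by merging the batch view with the real-time delta."""
    return batch_view.get(page, 0) + realtime_view.get(page, 0)

print(query("home"))    # 10042
print(query("signup"))  # 7
```

When the next batch run completes, its results absorb the events the speed layer counted, and the real-time view is reset — which is how the architecture keeps real-time answers without ever letting the approximate delta grow unbounded.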
Using the Lambda Architecture, we have built numerous solutions ranging from web analytics to social media monitoring and banking applications. This allowed us to extract a reusable framework including integration for social media (Twitter/Facebook), social collaboration (Jive), user tracking across multiple channels, core banking solutions and many more to come.
We have built a solid library of ready-made analyses, for example process analysis, which visualizes where clients/users get stuck during a process or where they exit. Other examples are sentiment analysis, usage statistics by geolocation, and identification of top users and what they focus on.
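The process analysis mentioned above boils down to counting where user sessions end without completing. A minimal sketch, assuming a hypothetical clickstream where each session is an ordered list of process steps:

```python
from collections import Counter

# Hypothetical clickstream: one ordered list of process steps per session.
sessions = [
    ["start", "form", "payment", "done"],  # completed
    ["start", "form"],                     # exited at "form"
    ["start", "form", "payment"],          # exited at "payment"
    ["start"],                             # exited at "start"
]

# For each step, count how many sessions ended there without completing.
exit_points = Counter(s[-1] for s in sessions if s[-1] != "done")

print(exit_points)
```

Plotting these exit counts per step is exactly the kind of funnel view the ti&m dashboard renders, making it obvious where users drop out of a process.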
Visualization is the key to any data-driven initiative! We built a customizable dashboard to visualize real-time and historical data. As we don’t believe in anything proprietary, the collected data can also be used with visualization tools like Tableau or queried interactively using Hive, Pig or Drill. For recommender systems or personalized content, data can also be fed into business applications.
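To illustrate the last point, feeding collected data into a business application can be as simple as a popularity-based recommender over the interaction log. This is a hypothetical, deliberately tiny sketch (invented user and item names); a production recommender would of course use richer models:

```python
from collections import Counter

# Hypothetical interaction log collected by the platform: (user, item) pairs.
events = [("alice", "fx"), ("alice", "bonds"), ("bob", "fx"),
          ("bob", "funds"), ("carol", "fx"), ("carol", "funds")]

def recommend(user: str, k: int = 2) -> list:
    """Recommend the k most popular items the user has not yet interacted with."""
    seen = {item for u, item in events if u == user}
    popularity = Counter(item for _, item in events if item not in seen)
    return [item for item, _ in popularity.most_common(k)]

print(recommend("alice"))  # alice already saw "fx" and "bonds" -> ["funds"]
```

The same pattern — read from the serving layer, filter by what the user has seen, rank the rest — is how collected analytics data ends up driving personalized content in a business application.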
Regulation or data governance may require all data to be kept on premises. Our solution ships as virtual machines and can be installed on existing infrastructure. We also offer shared and privately hosted infrastructure in our datacenters. Consequently, the solution is perfectly suitable for building MVPs as well as for long-term production usage.
Digital analytics as a whole (organisational and technical) is very complex, so implementation has to proceed step by step. Primary data sources, and how they yield business value, have to be identified first. Immediate business value can’t always be guaranteed, as the value of data grows over time. A good example is collecting usage data, which over time allows ever deeper insights into how users behave and how they react to changes.
Once the primary goal and the long-term vision are agreed upon, implementation starts. We set up the needed infrastructure either on or off premises, with the main goal of achieving full data integration within the first iteration. Additionally, the data architecture is defined, including how and where data is stored and through which access patterns it is used. Last but not least, we set up the visualisation using our ti&m dashboard, Tableau, Hive, etc.
Please get in touch – come by for a Lab Visit at ti&m!