Our core business is divided into three areas: IT Development, Business Intelligence, and Big Data / Data Science. To maintain a technology watch and develop our expertise across these three pillars, we have set up three corresponding clusters.
The purpose of these clusters is to gather consultants around specific themes through workshops, courses, demos, and internal projects aimed at increasing their expertise.
This acquired and shared expertise helps employees better understand the similar issues encountered at our various customers.
Our consultants also work as Developers, Analysts, Testers, Project Leaders, Architects, Trainers, and Advisors.
It is sometimes difficult to present data simply to people who are not specialists in the field. Fortunately, Data Visualization tools can help you display your data in a clear and efficient way. The DataViz cluster helps you master these tools through projects and workshops, so that clients can better understand the meaning of the data their company generates.
A few technologies: Tableau, Qlik, Power BI, SAP Lumira, Node.js, SAP BI, Cognos, SQL Server Reporting/Analysis Services, Oracle BIEE, Oracle, SQL Server, MySQL, DB2, SAP Data Services, Oracle Data Integrator, SQL Server Integration Services,…
The goal of the database team is to maintain knowledge of technologies used within Hermès, for example PL/SQL, DBA, Apex, and BI Publisher, as well as other database technologies such as MySQL.
The objective of this team is clearly commercial. Thanks to the knowledge acquired through our various missions and within Hermès, we can organize seminars, create proofs of concept (POCs), and thereby approach clients and catch their attention.
This acquired expertise has helped us build partnerships with external companies such as Digora and Sysphère. These business relationships are important and help us find new clients.
Big Data and Data Science
The “new kid” of the group, the data science team is a multi-skilled team with a specific goal: extracting insights from the huge amounts of data you collect.
To do so, the cluster uses the latest technologies in Data Engineering and Advanced Analytics.
A few technologies: Hadoop, HDFS, S3, Azure Storage, Elasticsearch, Neo4j, Kafka, NiFi, Flink, Spark, Hive, Sqoop, Impala, Cassandra, MongoDB, Deep Learning, Machine Learning,…