
Alligator Company Software GmbH

Data analysis, data science, data engineering, business intelligence
5
1 review

Company profile

Our employees are experienced consultants with a combined total of over 150 years of project experience. Since 2002 we have successfully delivered projects using Kimball dimensional modelling, Inmon 3NF and modern Data Vault methods. We also describe ourselves as full-stack DWH developers: in addition to back-end architectures, we have built front-end solutions ranging from dashboards and reports to analytics. We have realised data integration projects with ETL tools and a variety of databases. More recently, we have been following and supporting the trend towards analytics engineering, which, as part of the modernisation of the tool landscape in the cloud, puts a stronger focus on good integration and maintainability and thus increasingly establishes data, and how it is handled, as a company asset. Software engineering methods are also becoming increasingly important in DWH and business intelligence work.
Daily rate
790€/day
Annual turnover
1-2 million
Employees
8 employees in total
Company type
Established service provider, nearshoring provider
Homepage
https://alligator-company.com/

References

IFRS-17 / DWH Data Integration

Hannover Re

Verified ratings

Communication
Adherence to deadlines
Quality

04/2020 - present

Hanover

Request similar project

Comment

During our collaboration, we came to know Christian Drossart as a competent and reliable person. Beyond his professional expertise, he stood out for his empathy, helpfulness and team spirit. Even in stressful situations, Mr Drossart showed great resilience and always remained easy-going and friendly. For this reason, he was also highly valued by the other employees and contributed significantly to the progress and success of the project.

Project description

  • Data integration with Informatica PowerCenter for IFRS 17 reporting
  • Development of a logging template system (see the sketch after this list)
  • Introduction of Initial Recognition and Bypass
  • Implementation of the multi-step Protection Cover calculation
  • Development of the Allocation, Projection, Risk Adjustment, Contractual Service Margin, Deferred Acquisition Costs and Postings components
  • Maintenance of the internal wiki
  • Data integration development for the automation of the Solvency II QRTs
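
As a concrete illustration of the logging template point above: a minimal sketch of a run-logging helper, using SQLite as a stand-in database. The table and column names (etl_run_log, run_id, step, status) are illustrative assumptions; the actual system was built around Informatica PowerCenter.

```python
# Minimal sketch of an ETL run-logging template (illustrative names only).
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")  # stand-in for the DWH logging schema
conn.execute("""
    CREATE TABLE IF NOT EXISTS etl_run_log (
        run_id    TEXT,
        step      TEXT,
        status    TEXT,   -- STARTED / OK / FAILED
        logged_at TEXT,
        message   TEXT
    )
""")

def log_step(run_id: str, step: str, status: str, message: str = "") -> None:
    """Write one audit record per workflow step."""
    conn.execute(
        "INSERT INTO etl_run_log VALUES (?, ?, ?, ?, ?)",
        (run_id, step, status, datetime.now(timezone.utc).isoformat(), message),
    )
    conn.commit()

log_step("ifrs17_2020_04", "load_initial_recognition", "STARTED")
log_step("ifrs17_2020_04", "load_initial_recognition", "OK", "42 rows loaded")
```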
ELT
Data Engineering
SQL
Windows PowerShell
ETL
Data warehouse
Data Management
Informatica PowerCenter
Business Intelligence
Oracle
Exasol
Microsoft Server

DWH 3.0

VHV Insurance

Evaluation

No rating available

01/2016 - 01/2017

Hanover

Request similar project

Project description

Support with the introduction of the MID BI Solution. Advice on the use of modelling methods and tools, in particular on:
- Model-driven procedures and appropriate modelling for capturing, specifying and realising the requirements of a DWH
- Data and layer architecture of the DWH
- Coaching the developers on optimal use of MID Innovator Information Architect, taking advantage of existing automatisms, model transformations, checks and generators
- Integration of the Innovator into the VHV IT infrastructure and rights management
Conception and development of further automatisms:
- Export of Data Vault mapping metadata to SAS Data Integrator
- Extension of the database DDL generation to create tablespaces according to the VHV architecture guideline (see the sketch below)
Planning and support for the migration of MID Innovator Information Architect from v12.3.1 to v13.1.2

Tools used:
Data Vault 2.0, DB2 11.0+ with BLU Acceleration, SAS DI and Talend, Innovator, Eclipse, FlyWay, JMeter

Tasks:
Development, modelling and ETL of Data Vault requirements, tool implementation and data architecture
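
To make the DDL-generation extension tangible, here is a hedged sketch of a generator that assigns a DB2 tablespace per DWH layer. The naming convention (TS_<LAYER>_DATA) and the layer names are assumptions for illustration, not the actual VHV architecture guideline.

```python
# Sketch: render CREATE TABLE DDL with a tablespace chosen by DWH layer.
LAYER_TABLESPACE = {
    "stage": "TS_STAGE_DATA",   # illustrative convention, not the VHV guideline
    "rawvault": "TS_RV_DATA",
    "datamart": "TS_DM_DATA",
}

def render_create_table(schema: str, table: str,
                        columns: dict[str, str], layer: str) -> str:
    cols = ",\n    ".join(f"{name} {dtype}" for name, dtype in columns.items())
    return f"CREATE TABLE {schema}.{table} (\n    {cols}\n) IN {LAYER_TABLESPACE[layer]};"

print(render_create_table(
    "RV", "H_CONTRACT",
    {"HK_CONTRACT": "CHAR(32) NOT NULL", "LOAD_DTS": "TIMESTAMP NOT NULL"},
    layer="rawvault",
))
```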

Data Architectures
Docker
ETL
Data warehouse
Jmeter
Kubernetes
Data Vault
Talend
Insurance Industry
Infrastructure as Code
Eclipse
Requirements Analysis
Business Intelligence
IT Infrastructure
Coaching
Migration
DB2
Mid Innovator

Modernisation Analytics Platform

Uniper

Evaluation

No rating available

03/2021 - 05/2021

Remote

Request similar project

Project description

Tasks

Steering of development topics as solution architect.

Development of a cloud data strategy.

Tools used

Azure, DataOps

Azure
Data Strategy
DataOps

Data Vault for a bank's central database

HSH Nordbank

Evaluation

No rating available

06/2018 - 11/2018

Kiel

Request similar project

Project description

Requirements analysis and implementation of requirements in Data Vault and in the UDG Eclipse generator. Implementation in Oracle packages and views. Work in an agile Scrum environment, analysing the results with complex SQL and PL/SQL statements.

Tools used:
Oracle 12.2 SQL, PL/SQL, Eclipse

Tasks:
Development, modelling and ETL of Data Vault requirements for the central database

SQL
ETL
Banking
Data Vault
Eclipse

Data Vault Modelling and ELT

VHV Hanover

Evaluation

No rating available

07/2016 - 12/2016

Hanover

Request similar project

Project description

Data Vault modelling with MID Innovator and SAS DI. Organisation of and consulting on the review process. Advising the modellers in the implementation project. Creation of DB2 data mart views and definition of Business Vault rules in BPMN.

Tools used:
MID Innovator, DB2, SQL

Tasks:
Development and modelling guidelines, modelling of data models in the DWH core

ELT
BPMN
Data Vault
Mid Innovator

DWH 3.0 VHV, Vereinigte Hannoversche Versicherung

VHV Insurance

Evaluation

No rating available

02/2017 - 12/2017

Request similar project

Project description

Support with the adaptation of the MID BI Solution.
Advice and coaching on the use of modelling methods and tools, in particular on:
- Model-driven approaches and appropriate modelling for capturing, specifying and realising the requirements of a DWH
- Data and layer architecture of the DWH
- Coaching modellers and developers on optimal use of the Innovator tool, taking advantage of existing automatisms, model transformations and checks
- Design and implementation of a modelling approach optimised for customer needs, with simultaneous adaptation of the automation to the new target architecture: data integration was switched to Talend EE with embedded ELT SQL
Conception and adaptation or new development of automatisms:
- Generation of VHV-specific DV 2.0 DML SQL for Data Vault hub, link and satellite (see the sketch after this list)
- RefTable + RefTable satellite
- Status satellite for hub/link & RefTable
- Generation of Talend DV supply jobs using Talend JobScript, embedding the DV DML SQL (see the points above)
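
The hub case of the DML generation can be sketched as follows; the SQL pattern (insert only unknown hash keys) is standard Data Vault 2.0, while all table and column names are illustrative rather than the VHV-specific output.

```python
# Sketch: generate a DV 2.0 hub-load DML statement from metadata.
def hub_load_sql(hub: str, hk: str, bk: str, stage: str) -> str:
    return f"""
INSERT INTO {hub} ({hk}, {bk}, LOAD_DTS, RECORD_SOURCE)
SELECT DISTINCT s.{hk}, s.{bk}, s.LOAD_DTS, s.RECORD_SOURCE
FROM {stage} s
WHERE NOT EXISTS (SELECT 1 FROM {hub} h WHERE h.{hk} = s.{hk});
""".strip()

print(hub_load_sql("RV.H_CONTRACT", "HK_CONTRACT", "CONTRACT_NO", "STG.CONTRACT"))
```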

Tools used:
Solution: Data Vault 2.0, IBM DB2 BLU, SAS DI, Talend, FlyWay.
Development process: Jira, Git, Innovator, FlyWay, JMeter.
Generator development: Innovator, Xpand/Xtend, Eclipse, Java, EclEmma, Maven

Tasks:
Development, modelling and ETL of Data Vault requirements for regulatory bank reporting

ELT
SQL
ETL
Data Vault
Insurance Industry
Business Intelligence
Coaching
Apache Maven
Talend

DWH DataOps, architecture and introduction of Datavault Builder and Exasol

Vector.com, Stuttgart

Evaluation

No rating available

06/2020 - 03/2021

Stuttgart

Request similar project

Project description

Requirements analysis and implementation of requirements in the area of embedded software development for an automotive supplier. Introduction of DataOps methods in Microsoft Azure using Exasol and Datavault Builder. Consulting and training on modelling, development methods and cost analysis for the introduction of a new data warehouse infrastructure and development approach.

Tools used

Datavault Builder, Exasol, DBT, Kubernetes, Docker, Azure, DataOps

Tasks:
Conception, consulting and training for the introduction of the infrastructure and the development methods

Azure
Docker
Kubernetes
Requirements Analysis
DataOps
Exasol

BI Architect and Data Vault Modelling

VHV

Evaluation

No rating available

09/2015 - 03/2017

Request similar project

Project description

- Modelling E-DWH and Data Mart with MID Innovator
- Data Vault modelling and automation.
- DB2 Column Store Database DB2 BLU.
- OBIEE Frontend Reporting.
- TDD and DataOps/DevOps for MID automation and SAS DI / Talend batch jobs

Tools used:
MID Innovator, DB2 BLU, SAS-DI, SAS, OBIEE, batch processing, ELT/ETL, DataOps/DevOps, TDD

ELT
ETL
Frontend
DevOps
Data Vault
Business Intelligence
DB2
DataOps
Mid Innovator
Talend

Sales reporting for a tour operator

TUI AG

Evaluation

No rating available

09/2019 - 03/2020

Request similar project

Project description

In the existing DB2 data warehouse, sales reporting was supported using Informatica PowerCenter. Evaluations of booking statistics and various data marts were extended or newly created.

Tools used:
DB2 UDB, DB2 BLU, SQL, Informatica PowerCenter, UC4 Scheduling

Tasks:
Development, analysis, documentation

SQL
Data warehouse
Informatica PowerCenter

Dbt-exasol Adapter, POC SBK / BI Architecture and DWH Modernisation

SBK, Munich

Evaluation

No rating available

06/2019 - 12/2019

Munich

Request similar project

Project description

Implementation of a PoC and consulting on DWH modernisation with DBT and Datavault Builder on the Exasol database. Programming of the DBT adapter for Exasol in Python and Jinja. CI/CD using GitLab.

Tools used:

Exasol, DBT, ELT, Docker, GitLab, CI/CD, VisualStudio Code, DBeaver, batch processing, ELT/ETL, DevOps/DataOps, TDD, Python

In this project to modernise the existing data warehouse, analytics engineering methods such as Git, CI/CD and ELT were to be introduced on the basis of the existing Exasol analytics database. The processes were implemented in the test system of the Exasol DWH, using DBT and GitLab to develop the ELT batch processing and an example data mart. The corresponding development environment for Python and DBT was provided using Docker and docker-compose. The DBT adapter for Exasol was developed in Python and Jinja for this purpose (see the sketch below).

Using data modelling in MID Innovator, the corresponding Data Vault components were generated from a business object model as DBT models, so that the data integration via hub, satellite and link could be generated using the Data Vault patterns. In addition, Supernova views were generated that represented the Data Vault integration model again as entities, including the versioning, as DBT models. On this basis, the business rules and the derived data mart models could then be developed as DBT models in SQL.

The model data, the metadata and the DBT models, including the SQL-based DBT models, can be versioned in GitLab. By selecting elements in the MID business object model, the generation can also be filtered specifically, so that incremental and parallel development is possible with multiple developers and corresponding feature branches in GitLab. At the end of the sprint, the feature branches can be merged again to generate the sprint release.

Tasks:
Development, modelling and ELT of data models in the DWH.
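
A minimal sketch of the Python + Jinja generation step, assuming jinja2 is installed: rendering a Data Vault hub load as a dbt-style SQL model from metadata. All names are illustrative; in the project they were derived from the MID Innovator business object model, and the emitted dbt models would use dbt's source()/ref() macros rather than hard-coded schema names.

```python
from jinja2 import Template

# Illustrative template for a hub model generated from metadata.
HUB_MODEL = Template("""\
SELECT DISTINCT
    {{ hash_key }}     AS hub_key,
    {{ business_key }} AS business_key,
    load_dts,
    record_source
FROM {{ src_schema }}.{{ src_table }}
""")

print(HUB_MODEL.render(hash_key="HK_CUSTOMER", business_key="CUSTOMER_NO",
                       src_schema="STAGE", src_table="CUSTOMER"))
```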

ELT
SQL
Data warehouse
GitLab
Business Intelligence
DataOps
Development Environments
Exasol
Analytics Engineering
Mid Innovator
Python

DatavaultBuilder Insurance Modelling and Implementation

Uelzener animal insurance

Evaluation

No rating available

10/2017 - 03/2020

Request similar project

Project description

In the enterprise data warehouse, I created the data modelling using Datavault Builder and managed the loads accordingly. The data model was implemented according to the business analyst's technical specifications and covered contract, claims, customer and payments. A special feature was the bi-temporal source system, which was realised in Data Vault with versioned hubs (see the sketch below). Based on the initial model defined in this way, I created the initial SQL views for the Qlik front end and carried out the performance optimisations on the SQL Server: execution plans were analysed and appropriate measures such as indexing and Business Vault materialisations were applied. In addition, the existing Talend ETL was replaced and adapted to the new source structures of the EDWH.
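
The versioned-hub pattern for the bi-temporal source can be sketched as follows; the table layout is an illustrative assumption in T-SQL flavour (matching the SQL Server platform), not the actual customer model.

```python
# Sketch: a "versioned hub" carries the source-side version in the hub,
# so the hash key is computed over (business key, version).
VERSIONED_HUB_DDL = """
CREATE TABLE rv.h_contract_v (
    hk_contract_v  CHAR(32)     NOT NULL PRIMARY KEY, -- hash(contract_no, version_no)
    contract_no    VARCHAR(30)  NOT NULL,
    version_no     INT          NOT NULL,             -- bi-temporal source version
    load_dts       DATETIME2    NOT NULL,
    record_source  VARCHAR(100) NOT NULL
);
"""
print(VERSIONED_HUB_DDL)
```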

Tasks:
Development

Tools used:
Datavault Builder, SQL Server, SQL, Git, Qlik, Talend

SQL
ETL
Frontend
Git
Insurance Industry
Data Modeling
Talend

Data Vault Online Trade

MyToys

Evaluation

No rating available

03/2017 - 07/2017

Berlin

Request similar project

Project description

Requirements analysis and creation of concepts. Data Vault modelling and ETL in the area of online retail. Implementation in Oracle packages and views. Work in an agile Scrum environment.

Tools used:
Oracle 12.2, SQL, PL/SQL

Tasks:
Development, modelling and ETL of Data Vault requirements in online commerce

SQL
ETL
Data Vault
Requirements Analysis

BI Architect and Data Vault Modelling

HSH Nordbank

Evaluation

No rating available

06/2018 - 01/2019

Request similar project

Project description

Architecture consulting in the implementation project. UDG Data Vault generator in Xtext and Java. Physical database optimisation on Oracle.

Tools used:
UDG Data Vault, Oracle, batch processing, ELT/ETL, TDD

ELT
ETL
Xtext
Data Vault
Business Intelligence
Oracle

BI Architect and Data Vault Modelling

Talanx - HDI Global SE

Evaluation

No rating available

04/2017 - 07/2018

Request similar project

Project description

Architectural responsibility, modelling guidelines and advice to the modellers in the implementation project. UDG Data Vault generator in Xtext and Java. IGC glossary and mapping transports. IDA logical modelling and implementation in Data Vault. Development of test scenarios for batch jobs with DbFit/FitNesse.

Tools used:
UDG Data Vault, Oracle, batch processing, ELT/ETL, TDD

Xtext
Data Vault
Business Intelligence

ZDP, HSH Nordbank

HSH Nordbank

Evaluation

No rating available

04/2018 - 06/2018

Kiel

Request similar project

Project description

Review of measures for the redesign phase of a central data distribution platform and review of the Data Vault methodology. Recommendation of further measures. Design and conception of solution approaches for problem areas. Support of the architect in the architecture team. Clarification of questions and specification of architecture requirements. Development of solution approaches for conceptual problems, e.g. the multi-active satellite (see the sketch below).
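
One of the named solution approaches, the multi-active satellite, can be sketched like this: several records per hub key are active at the same time and are distinguished by a subsequence key. The names are illustrative (Oracle flavour, matching the project stack).

```python
# Sketch: multi-active satellite with a subsequence key in the primary key.
MULTI_ACTIVE_SAT_DDL = """
CREATE TABLE rv.s_party_phone_ma (
    hk_party      CHAR(32)      NOT NULL,
    load_dts      TIMESTAMP     NOT NULL,
    phone_type    VARCHAR2(20)  NOT NULL,  -- subsequence key: several active rows
    phone_number  VARCHAR2(40),
    hashdiff      CHAR(32)      NOT NULL,
    record_source VARCHAR2(100) NOT NULL,
    CONSTRAINT pk_s_party_phone_ma PRIMARY KEY (hk_party, load_dts, phone_type)
);
"""
print(MULTI_ACTIVE_SAT_DDL)
```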

Tools used:
Oracle 12.2 SQL, Eclipse

Tasks:
Development, modelling and Data Vault data architecture

SQL
Data Vault
Eclipse

Building the Core Business Information Warehouse

Otto

Evaluation

No rating available

08/2021 - 05/2023

Remote

Request similar project

Project description

Tasks

System and business analysis of new source systems to be connected, taking GDPR (DSGVO) requirements and governance processes into account. Design and implementation of the data pipeline from the source via the data lake ingest to the integration into the core DWH, based on a Data Vault modelling approach (see the sketch below). Development of the solution architecture and introduction of the required components into the software architecture. Project team of 20 people.
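
A hedged sketch of the ingest step from GCP Pub/Sub into the data lake layer in BigQuery. Project, subscription and table ids are placeholders, and the real pipeline additionally covered the GDPR checks and the downstream Data Vault integration.

```python
import json
from google.cloud import bigquery, pubsub_v1

PROJECT = "my-project"                       # placeholder
SUBSCRIPTION = "source-events-sub"           # placeholder
TABLE = "my-project.datalake.source_events"  # placeholder

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data)
    errors = bq.insert_rows_json(TABLE, [row])  # streaming insert into the lake
    if not errors:
        message.ack()  # unacked messages are redelivered by Pub/Sub

future = subscriber.subscribe(sub_path, callback=handle)
future.result()  # blocks; in practice this runs as a deployed service
```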

Tools used

Databases: BigQuery & Exasol
GCP Pub/Sub, GCP Functions, GCP API Gateway, GCP Cloud Build
DevOps & Terraform

Tasks:

Development and modelling: Data Vault, dbtvault, Google BigQuery, Docker

Cloud Integration
Data warehouse
Terraform
Google BigQuery
DevOps
DSGVO
Business Analysis
Exasol
GCP

Informatica PowerCenter Data Vault Automation

Alligator Company Software GmbH

Evaluation

No rating available

01/2016 - 04/2017

Request similar project

Project description

A generator solution for Data Vault was developed for a publishing group. The PowerDesigner data models and the corresponding Informatica mappings served as the basis: the target model was read from the PowerDesigner XML, and the mappings for hub, link and satellite were generated from it. I developed the tool in Java and delivered it to the customer.
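
The first step of such a generator, reading the model export, might look like the following sketch. The real tool was written in Java, and real PowerDesigner XML uses its own namespaces; the structure below is deliberately simplified for illustration.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a PowerDesigner model export.
PDM_SNIPPET = """
<Model>
  <Tables>
    <Table Name="CUSTOMER"><Column Name="CUSTOMER_NO"/><Column Name="NAME"/></Table>
    <Table Name="ORDERS"><Column Name="ORDER_NO"/><Column Name="CUSTOMER_NO"/></Table>
  </Tables>
</Model>
"""

root = ET.fromstring(PDM_SNIPPET)
for table in root.iter("Table"):
    cols = [c.get("Name") for c in table.iter("Column")]
    # In the real generator, each table fed the hub/link/satellite mapping generation.
    print(table.get("Name"), cols)
```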

Tasks:
Development and architecture

Tools used:
Informatica PowerCenter, Java, XML, PowerDesigner

Java
Data Vault
Informatica PowerCenter
XML

Informatica, travel company database

TUI, Hanover

Evaluation

No rating available

12/2018 - 04/2020

Hanover

Request similar project

Project description

Requirements analysis and implementation of requirements for a travel provider using Informatica and DB2 LUW objects. Work in an agile Scrum environment. Analysis of results with complex SQL. Conversion of ETL routes into SQL objects with the help of the DBT tool, and use of Amazon AWS.

Tools used:
DB2 LUW, Informatica, DBT, Amazon Redshift, AWS S3, s3cmd, Python

Tasks:
Development, modelling and ETL for the central DB customer database

ELT
SQL
ETL
Data warehouse
AWS
Data Management
Data Vault
Informatica PowerCenter
Analytics Engineering
Cloud Integration
Scrum
Requirements Analysis
Snowflake
DB2

Data Warehouse Automation Exasol & Datavault Builder

Evaluation

No rating available

05/2021 - 06/2022

Request similar project

Project description

In the data warehouse, the data modelling was created using Datavault Builder. Exasol served as the database, and DBT was used for the ETL/ELT processes.

Tools used:

dbt
Datavault Builder
Exasol

BI Architecture, Data Vault and Cloud Migration

TUI InfoTec

Evaluation

No rating available

01/2019 - 03/2020

Request similar project

Project description

Architecture consulting and implementation of DevOps/DataOps mechanisms, and renewal of the data pipeline in Python. Definition of test scenarios with DBT. Scheduling of batch processing using Airflow (see the sketch below). CI/CD using GitLab.
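
The Airflow part can be sketched as a minimal DAG that runs the dbt steps as shell commands (Airflow 2.x API). The DAG id, schedule and project path are assumptions, not taken from the project.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dwh_dbt_pipeline",          # placeholder
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",         # placeholder schedule
    catchup=False,
) as dag:
    dbt_run = BashOperator(task_id="dbt_run",
                           bash_command="cd /opt/dbt_project && dbt run")
    dbt_test = BashOperator(task_id="dbt_test",
                            bash_command="cd /opt/dbt_project && dbt test")
    dbt_run >> dbt_test  # run the models first, then the tests
```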

Tools used:
Docker, AWS S3, SQL, DB2 BLU, Amazon Redshift, Snowflake DB, DBT, Airflow, Kubernetes, batch processing, ELT/ETL, DevOps/DataOps, TDD, CI/CD, Gitlab

ELT
Docker
SQL
ETL
Kubernetes
GitLab
AWS
Data Vault
DataOps
DevOps
Business Intelligence
Snowflake
DB2
Python

DWH Modernization

E.ON

Evaluation

No rating available

05/2020 - 07/2021

Munich

Request similar project

Project description

dbtvault data integration and data mart modeling

Data Lake
ELT
Data Engineering
SQL
Cloud Integration
Data warehouse
Data Management
Data Vault
Business Intelligence
Snowflake
DataOps
Analytics Engineering

PL/SQL development and data analysis

Josef Witt GmbH

Evaluation

No rating available

08/2020 - 10/2020

Request similar project

Project description

In the migration project, data analyses were carried out and PL/SQL routines were created to migrate the data as part of a merger of the operational systems.

Tasks:
Development

Tools used:
Oracle, PL/SQL

SQL
Oracle PL/SQL
Migration
Data Analysis

Snowflake, DBT in the travel business

TUI, Hanover

Evaluation

No rating available

12/2019 - 03/2020

Hanover

Request similar project

Project description

Introduction of DBT on Snowflake in AWS, with an Airflow scheduler and a Python operator for DBT. The existing data warehouse, which had been running for 10 years, was to be renewed and migrated to the AWS Cloud as part of the cloud strategy. A lift-and-shift approach with Informatica (the existing ETL tool vendor) was discarded after a PoC.

Using Python-based data pipelines, the data is integrated into a persistent staging area (PSA) with JSON structures in Snowflake. External table definitions over the S3 buckets are generated as views. Based on this source data, the interfaces are technically historicised; these models are generated using Python and Jinja (see the sketch below). Subsequently, the Data Vault model is generated via DBT models, and so-called flat marts are generated in the output layer and made available to the front end.

The orchestration was done with Airflow and the deployment ran in a Kubernetes cluster. Using GitLab, appropriate CI/CD pipelines were developed that coordinated the Docker images for the data pipeline, the DBT DAGs and the corresponding DBT projects in order to execute developer and integration tests. Programming the Python pipelines and generating the PSA were my tasks.
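
The technical historisation of a PSA interface can be sketched as a generated view: valid_from/valid_to are derived per business key from the append-only PSA with a window function. Schema, table and column names are illustrative assumptions, and jinja2 is assumed to be installed.

```python
from jinja2 import Template

# Sketch: generate a historisation view over one PSA interface.
HIST_VIEW = Template("""\
CREATE OR REPLACE VIEW psa.v_{{ interface }}_hist AS
SELECT
    business_key,
    payload,
    load_dts AS valid_from,
    LEAD(load_dts) OVER (PARTITION BY business_key ORDER BY load_dts) AS valid_to
FROM psa.{{ interface }};
""")

print(HIST_VIEW.render(interface="booking"))
```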

Tools used:
DB2 LUW, DBT, AWS S3, s3cmd, Python, Jinja

Tasks:
Development, modelling and ELT for the central DB customer database

ELT
Informatica
ETL
Frontend
Kubernetes
AWS
Snowflake
DB2
Python

Business Objects DS and Universe

IT.NRW

Evaluation

No rating available

05/2017 - 09/2017

Request similar project

Project description

New ETL routes were built in the existing data warehouse using BODS, filling the data model according to the report requirements. Based on this, I created the BusinessObjects universe for ad-hoc analysis as well as reports for finished delivery. To optimise the performance of the existing ETL routes, execution plans were analysed and the SQL or the ETL was adapted accordingly.

Tasks:
Development

Tools used:
BODS, Oracle, BusinessObjects Reports

ETL

Data Vault Bank Regulator

Berenberg Bank

Evaluation

No rating available

07/2017 - 04/2018

Hamburg

Request similar project

Project description

Requirements analysis and implementation of Business Vault requirements in the Data Vault. Adaptation of the existing Data Vault. Provision of data for the regulatory BAIS software. Implementation in Oracle packages and views. Work in an agile Scrum environment. BSM reporting software was used. Project approach according to SAFe.

Tools used:
Oracle 12.2, SQL, PL/SQL

Tasks:
Development, modelling and ETL of Data Vault requirements for regulatory bank reporting

SQL
ETL
Banking
Scrum
Data Vault
Oracle

Informatica, travel company database

TUI, Hanover

Evaluation

No rating available

09/2019 - 03/2020

Hanover

Request similar project

Project description

Requirements analysis and implementation of requirements for a travel provider.

AWS data pipeline into the Snowflake DWH with Docker images on Kubernetes clusters for data transport.

Evaluation of Infrastructure as Code using CloudFormation and Terraform. Serverless services using Lambda.

Tools used: DB2 LUW, Informatica, DBT, Kubernetes, Docker, Snowflake, Amazon Redshift

Tasks: Development, modelling and ETL for the central DB customer database.

Docker
Informatica
dbt
ETL
Data warehouse
Kubernetes
Amazon Redshift
AWS
Infrastructure as Code
Requirements Analysis
Snowflake
DB2

Data Vault DWH Project

VHV Hanover

Evaluation

No rating available

05/2016 - 07/2016

Request similar project

Project description

Support and advice on the introduction of the MID BI Solution:
- Coaching of the modellers on the optimal use of the Innovator tool, taking advantage of existing automatisms, model transformations and checks
- Conception and development of further automatisms for customer needs
- Innovator function for importing mapping information into existing reports, in the form of model links between attributes within the data model

Tools used:
MS SQL Server 2016, Data Vault 1.0 in combination with bitemporal historisation, Innovator v12.3 with new Java engineering actions

Tasks:
Landscape Bodenkreditbank, Münster: integration of various data sources (including host data) into an MS SQL Server 2016 platform using a modified Data Vault 1.0 procedure

SQL
Java
Data Vault
Business Intelligence
Coaching

Datavault Automation with DBT on Snowflake with Airflow, DataOps

E.ON

Evaluation

No rating available

08/2020 - 09/2020

Munich

Request similar project

Project description

Review and advice on the concepts of Datavault modelling, automation and DataOps. Suggestions for improvements in architecture and implementation using DBT, Airflow, Docker, Kubernetes and Snowflake.

Tools used:
Datavault Builder, Snowflake, DBT, Kubernetes, Docker, Azure DataOps, Airflow

Tasks:
Review and advice

Azure
Docker
Kubernetes
Snowflake
DataOps

Project SPoT

NORD/LB - Norddeutsche Landesbank

Evaluation

No rating available

08/2013 - 09/2019

Request similar project

Project description

NORD/LB is setting up a data hub. The aim is to replace old host applications and to set up a new delivery route for SAP Bank Analyzer 8.0. The Bank Analyzer is required for Basel III.

Tools used:
DB2 9.7/10.2, IBM DataStage 8.7, IBM ClearCase/ClearQuest 7.1

Tasks:
Documentation, programming, test and release management, deployment

Banking
DB2

Main focus

Data Engineering
Data warehouse
Data Vault
Data Modeling
Business Intelligence
Analytics Engineering
Data Analysis

Other skills

ELT
SQL
ETL
dbt
Data Management
DataOps
Data Integration
Git
Software Engineering
Snowflake
Performance Optimization
Exasol
Python
dbtvault

Industries

Insurance Industry
0 - 10 projects
Banking Sector
0 - 10 projects
Automotive
0 - 10 projects
Retail
0 - 10 projects
