BBDC

"Berlin Big Data Center"



Our mission is to perform groundbreaking research and development, to train the data scientists of tomorrow, and to create solutions that facilitate the deep analysis of massive amounts of heterogeneous data sets and streams at high velocity.

To prepare industry, science, and society in Germany and Europe for the global big data trend, closely coordinated activities in research, teaching, and technology transfer are required to integrate data analysis methods with scalable data processing. To achieve this, the Berlin Big Data Center pursues the following seven objectives:

  1. Pooling expertise in scalable data management, data analytics, and big data applications.
  2. Conducting fundamental research to develop novel and automatically scalable technologies capable of performing “Deep Analysis” of “Big Data”.
  3. Developing an integrated, declarative, highly scalable open-source system that enables the specification, automatic optimization, parallelization and hardware adaptation, and fault-tolerant, efficient execution of advanced data analysis problems, using varying methods (e.g., drawn from machine learning, linear algebra, statistics and probability theory, computational linguistics, or signal processing) and leveraging our work on Apache Flink (see the sketch after this list).
  4. Transferring technology and know-how to support innovation in companies and startups.
  5. Educating data scientists with respect to the five big data dimensions (i.e., application, economic, legal, social, and technological) via leading educational programs.
  6. Empowering people to leverage “Smart Data”, i.e., to discover newfound information based on their massive data sets.
  7. Enabling the general public to conduct sound data-driven decision-making.
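To illustrate what "declarative specification" in objective 3 means in practice, the following is a minimal, self-contained Apache Flink batch job in the well-known word-count style. It is not BBDC code: the class names, input strings, and structure are illustrative assumptions; only the Flink DataSet API calls themselves are real. The program states which transformations to apply, and the Flink runtime decides how to parallelize, optimize, and execute them fault-tolerantly.

    // Illustrative sketch only, not BBDC code: counts words in a small in-memory data set.
    import org.apache.flink.api.common.functions.FlatMapFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.util.Collector;

    public class WordCount {
        public static void main(String[] args) throws Exception {
            final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Hypothetical input; in a real deployment this would come from a file or stream source.
            DataSet<String> text = env.fromElements(
                "big data analysis at high velocity",
                "scalable data analysis with flink");

            // Declarative specification: the transformations describe the analysis,
            // while the engine handles parallelization and fault-tolerant execution.
            DataSet<Tuple2<String, Integer>> counts = text
                .flatMap(new Tokenizer())
                .groupBy(0)
                .sum(1);

            counts.print();
        }

        // Splits each line into (word, 1) pairs.
        public static final class Tokenizer implements FlatMapFunction<String, Tuple2<String, Integer>> {
            @Override
            public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                for (String word : line.toLowerCase().split("\\s+")) {
                    if (!word.isEmpty()) {
                        out.collect(new Tuple2<>(word, 1));
                    }
                }
            }
        }
    }

The same declarative style extends to the more advanced analysis methods named above (machine learning, linear algebra, signal processing), which is precisely where automatic optimization and hardware adaptation become valuable.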