
Commit dfb74c6

docs: add a sequence diagram and a description (#1757)

* add a sequence diagram and a description
* update description based on feedback
* Update README.md
* Update README.md

Co-authored-by: Mars Lan <[email protected]>

1 parent 4143fb9 commit dfb74c6

2 files changed, +13 -1 lines changed

docs/README.md

Lines changed: 13 additions & 1 deletion
@@ -3,6 +3,18 @@
DataHub is LinkedIn's generalized metadata search & discovery tool. To learn more about DataHub, check out our
[LinkedIn blog post](https://engineering.linkedin.com/blog/2019/data-hub) and [Strata presentation](https://speakerdeck.com/shirshanka/the-evolution-of-metadata-linkedins-journey-strata-nyc-2019). You should also visit [DataHub Architecture](architecture/architecture.md) to get a better understanding of how DataHub is implemented and [DataHub Onboarding Guide](how/entity-onboarding.md) to understand how to extend DataHub for your own use case.

In general, DataHub has two types of users in mind. One type owns metadata and uses the tools DataHub provides to ingest that metadata into DataHub; the other uses DataHub to discover the metadata available within it. DataHub offers an intuitive UI, full-text search, and graph-based relationship presentation to make metadata discovery and understanding much easier.

The following sequence diagram highlights DataHub's key features and how the two types of users, metadata ingestion engineers and metadata discovery users, can take full advantage of DataHub.

![datahub-sequence-diagram](imgs/datahub-sequence-diagram.png)
1. It starts with ingesting your metadata into DataHub. We provide a [collection of sample Python scripts](https://github.com/linkedin/datahub/tree/master/metadata-ingestion) for you. These scripts work with popular relational databases, extract metadata from the data source, and publish it in Avro format to the MetadataChangeEvent (MCE) Kafka topic (a hedged producer sketch follows this list).
2. The MetadataChangeEvent (MCE) processor consumes messages from that topic, applies the necessary transformations, and sends the result to the Generalized Metadata Service (GMS), which persists the metadata to a relational database of your choice. Currently MySQL, PostgreSQL, and MariaDB are supported (see the consumer sketch below).
3. GMS also checks whether the received metadata has a previous version. If so, it publishes the difference to Kafka's MetadataAuditEvent (MAE) topic.
4. The MAE processor consumes MetadataAuditEvent messages from Kafka and persists them to Neo4j and Elasticsearch (ES) (see the indexing sketch below).
5. The DataHub frontend talks to GMS's RESTful metadata APIs. Metadata discovery users can browse and search metadata, and view details such as ownership, lineage, and custom tags (see the query sketch below).
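
As a rough illustration of step 1, here is a minimal producer sketch, assuming the kafka-python client, a broker on localhost:9092, and a simplified JSON record standing in for the real Avro MCE schema:

```python
# Minimal sketch of step 1: publish one metadata record to the MCE topic.
# Assumptions: kafka-python client, local broker, and a simplified JSON
# payload standing in for the real Avro MCE schema.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Hypothetical metadata extracted from a relational database table.
mce = {
    "urn": "urn:li:dataset:(urn:li:dataPlatform:mysql,db.users,PROD)",
    "aspects": {"ownership": {"owners": ["datahub"]}},
}

producer.send("MetadataChangeEvent", value=mce)
producer.flush()
```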
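
For step 2, a hedged sketch of an MCE-style consumer that applies a trivial transformation and hands the record to GMS over HTTP; the topic name, record layout, and GMS endpoint are illustrative assumptions, not GMS's documented API:

```python
# Sketch of step 2: consume MCE messages, transform, and forward to GMS.
# Assumptions: JSON messages (real MCEs are Avro) and a hypothetical
# GMS endpoint; the actual rest.li routes differ.
import json

import requests
from kafka import KafkaConsumer

GMS = "http://localhost:8080"
consumer = KafkaConsumer(
    "MetadataChangeEvent",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    mce = message.value
    # A trivial "transformation": normalize the URN before the hand-off.
    mce["urn"] = mce["urn"].strip()
    # GMS persists the metadata to the relational store (MySQL/PostgreSQL/MariaDB).
    requests.post(f"{GMS}/datasets", json=mce, timeout=10).raise_for_status()
```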
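
For step 4, a sketch of an MAE-style consumer that indexes the new aspect version into Elasticsearch; the message layout and index name are assumptions, and a real job would also write relationship changes to Neo4j:

```python
# Sketch of step 4: consume MAE messages and index the new version into ES.
# Assumptions: JSON messages with "urn" and "newValue" fields, and a
# "datasets" index; the real MAE schema and index mappings differ.
import json

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

es = Elasticsearch(["http://localhost:9200"])
consumer = KafkaConsumer(
    "MetadataAuditEvent",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    mae = message.value
    # Make the latest version searchable; a production job would also
    # update the Neo4j graph with any relationship (e.g., lineage) changes.
    es.index(index="datasets", id=mae["urn"], body=mae["newValue"])
```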
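
And for step 5, a sketch of fetching a dataset's metadata from GMS with the requests library; the port, endpoint path, and query parameter are hypothetical stand-ins for GMS's actual routes:

```python
# Sketch of step 5: fetch a dataset's metadata from GMS over HTTP.
# The port, endpoint path, and query parameter are hypothetical stand-ins.
import requests

GMS = "http://localhost:8080"
urn = "urn:li:dataset:(urn:li:dataPlatform:mysql,db.users,PROD)"

resp = requests.get(f"{GMS}/datasets", params={"urn": urn}, timeout=10)
resp.raise_for_status()
metadata = resp.json()
print(metadata.get("ownership"), metadata.get("upstreamLineage"))
```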
# Documentation
* [DataHub Developer's Guide](developers.md)
* [DataHub Architecture](architecture/architecture.md)
@@ -13,4 +25,4 @@ DataHub is LinkedIn's generalized metadata search & discovery tool. To learn mor
* [Generalized Metadata Service](https://github.com/linkedin/datahub/tree/master/gms)
* [Metadata Ingestion](https://github.com/linkedin/datahub/tree/master/metadata-ingestion)
* [Metadata Processing Jobs](https://github.com/linkedin/datahub/tree/master/metadata-jobs)
* [The RFC Process](rfc.md)
docs/imgs/datahub-sequence-diagram.png (91.2 KB, new binary image)
