Explore projects
Acts as an intermediary between the ingest workflow and the services tasked with extracting metadata from media.
Serializes the mappings of IDs into a NoSQL database (MongoDB).
Enrichment workflow for volume records, based on Apache Flink.
Library to extract specific datafields from MARCXML records
This is the second version of an API for the RDF data created by the linked swissbib project. It is inspired by similar ideas from lobid.org, and the implementation is therefore based on the Java/Scala Play Framework.
Endpoint for SOAP messages in general, but normally those sent by the OCLC CBS pusher functionality. Based on the Play Framework (https://www.playframework.com/), so SOAP messages are processed via a modern REST endpoint.
Flink job for grouping N-Triples with the same subject.
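The core of such a job is a key-by on the triple's subject followed by a per-key aggregation. A minimal sketch of that grouping logic in plain Java, without the Flink runtime (the parsing and the method names here are illustrative assumptions, not the project's actual code):

```java
import java.util.*;

public class GroupBySubject {
    // The subject of an N-Triples line is its first whitespace-delimited term.
    static String subjectOf(String triple) {
        return triple.split("\\s+", 2)[0];
    }

    // Group triples by subject, preserving input order within each group --
    // the effect a Flink keyBy(subject) plus aggregation would produce.
    static Map<String, List<String>> group(List<String> triples) {
        Map<String, List<String>> groups = new LinkedHashMap<>();
        for (String t : triples) {
            groups.computeIfAbsent(subjectOf(t), k -> new ArrayList<>()).add(t);
        }
        return groups;
    }

    public static void main(String[] args) {
        List<String> triples = List.of(
            "<http://example.org/a> <http://purl.org/dc/terms/title> \"A\" .",
            "<http://example.org/b> <http://purl.org/dc/terms/title> \"B\" .",
            "<http://example.org/a> <http://purl.org/dc/terms/creator> \"X\" .");
        System.out.println(group(triples).get("<http://example.org/a>").size()); // 2
    }
}
```

In the real job, Flink would distribute the groups across parallel operator instances keyed on the subject rather than collecting them in one map.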
A small Flink job to merge, filter, and sort DBpedia output triples. It reads a file and outputs to a Kafka topic.
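The three steps named above (merge, filter, sort) can be sketched as a single pipeline over the input lines; this is a hedged illustration in plain Java without Flink or Kafka, and the concrete filter predicate (dropping empty lines and blank-node subjects) is an assumption:

```java
import java.util.*;
import java.util.stream.*;

public class TripleCleaner {
    // Filter, merge (deduplicate), and sort N-Triples lines -- the same three
    // steps the job performs before writing to the Kafka topic.
    static List<String> cleanup(List<String> lines) {
        return lines.stream()
            .filter(l -> !l.isBlank() && !l.startsWith("_:")) // assumed predicate
            .distinct()                                       // merge duplicates
            .sorted()                                         // lexicographic sort
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> in = List.of(
            "<b> <p> \"2\" .",
            "<a> <p> \"1\" .",
            "<a> <p> \"1\" .",
            "_:n1 <p> \"x\" .");
        System.out.println(cleanup(in)); // [<a> <p> "1" ., <b> <p> "2" .]
    }
}
```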
A small workflow which creates clusters from the DBpedia global ID management file.
API serving as a gateway for the browsing functionality; it analyzes and prepares browsing values as search requests for the search engine.
Exports archival data from swisscollections for Kalliope and Archivportal Europa.
Straightforward service to support ID concordance between former swissbib IDs and the new IDs of the SLSP era. The service is based on the Play Framework and deployed into a k8s cluster.
Creates index terms for various fields (topics) used for the sorted browsing functionality in swisscollections.ch.
Filters out records from the sources (for example Alma) that should not land in swisscollections.
Microservice to feed ingested documents from various sources into the MongoDB data lake (source: Kafka / sink: MongoDB).
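The source/sink pattern behind such a feed service can be shown with the Kafka consumer and MongoDB collection reduced to two small functional interfaces; everything below (interface and method names, batch semantics) is an illustrative assumption rather than the service's real API:

```java
import java.util.*;

// Sketch of a source-kafka / sink-mongodb feed loop with both sides mocked.
public class FeedService {
    interface Source { List<String> poll(); }          // stands in for a Kafka consumer
    interface Sink { void insert(String document); }   // stands in for a MongoDB collection

    // Drain one polled batch from the source into the data-lake sink.
    static int feedBatch(Source source, Sink sink) {
        List<String> batch = source.poll();
        batch.forEach(sink::insert);
        return batch.size();
    }

    public static void main(String[] args) {
        List<String> lake = new ArrayList<>();         // in-memory stand-in for the data lake
        int n = feedBatch(
            () -> List.of("{\"id\":1}", "{\"id\":2}"), // one batch from the "topic"
            lake::add);
        System.out.println(n + " " + lake.size()); // 2 2
    }
}
```

A production service would run this in a loop, committing Kafka offsets only after the MongoDB writes succeed.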
Normalizes the encoding of documents delivered by SLSP.
Fetches information from parent and ancestor records.
Takes documents from MongoDB and sends them to Kafka.