Over 10,000 physical typewritten documents from 1932 to 1941 had to be digitised, structured, and connected in order to create a single, centralised source of knowledge that enables the analysis of historical processes.
Presentation by Dr. Alessandro Negro, Chief Scientist at GraphAware and author of the Manning book Graph-Powered Machine Learning, covering the following topics:
Why unlimited scale is important when using graph databases
The new graph database scaling capabilities built by Neo4j developers
The role of graphs in supporting machine learning applications
How Neo4j assists customers in scaling their applications
Concrete examples of machine learning projects that can leverage graph sharding
The recording is available as well: https://bit.ly/39ZqFVE
Christophe Willemsen, CTO at GraphAware, goes over some tips and tricks on Relevant Search with Neo4j’s Lucene-based search engine.
Ever wondered how ML can be used to build a Knowledge Graph that allows businesses to successfully differentiate and compete today? We will demonstrate how Computer Vision, NLP/NLU, knowledge enrichment and graph-native algorithms fit together to build powerful insights from various unstructured data sources.
The answer to most general-purpose graph modelling questions is “it depends”. This talk demonstrates the pitfalls of modelling without knowing the use cases: it shows how two sets of people can produce two different models for the same set of data elements, and how use cases should guide the model.
So, for your brand new project, you decided to throw away your monolith and go for microservices. But after a while, you realize things are not going as smoothly as expected ;-)
Fortunately, a graph can help you detect antipatterns, visualize your whole system, and even do cross-service impact analysis.
In this talk, we’ll analyze a microservice system based on Spring Cloud, with jQAssistant and Neo4j. We will see how it can be helpful to answer questions like:
Do I have anti-patterns in my microservice architecture?
Which services / applications are impacted when doing a database refactoring?
Is my API documentation / specification up to date?
How do I get an up-to-date visualization of my whole system?
And more!
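To give a flavour of the kind of analysis involved: jQAssistant scans the codebase into Neo4j, after which Cypher queries can surface structural problems. The sketch below is a hypothetical example, assuming jQAssistant's default schema (`:Artifact` nodes with `DEPENDS_ON` relationships); the exact labels and relationships depend on the plugins and scan configuration used.

```cypher
// Hypothetical sketch: detect cyclic dependencies between artifacts
// (e.g. services), an anti-pattern in a microservice architecture.
// Assumes jQAssistant's :Artifact label and DEPENDS_ON relationship.
MATCH (a:Artifact)-[:DEPENDS_ON]->(b:Artifact)
WHERE a <> b
  AND EXISTS { (b)-[:DEPENDS_ON*]->(a) }   // a path back closes the cycle
RETURN a.name AS service, b.name AS dependsOn
```

A similar query over nodes representing database tables and the services reading them would support the cross-service impact analysis mentioned above.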
Visualizing a complex graph is a task of simplifying the graph and providing well-thought-out visual cues; the best UI goes unnoticed. This talk will summarize current approaches and present a novel user interaction pattern that takes advantage of the performant Neo4j graph engine.
Unblocking dependencies benefits any organization that performs work concurrently. Dependencies are inherently connected, and modelling them as a graph surfaces those connections quickly, enabling decisions that reduce waste and make delivery more efficient.
Tracking end-of-line manufacturing issues to their source can be a daunting task. Boston Scientific, in partnership with GraphAware, has used the Neo4j platform to build a manufacturing quality tool that offers dramatic improvements to the time, quality, and quantity of investigations. In this talk we will review a manufacturing value stream in a graph and discuss the analysis methods available, which have produced striking gains in business efficiency for this unique application. We will also present how the system was implemented within the existing data architecture and then scaled from a laptop investigational tool to an enterprise-grade solution with Neo4j Server.
When privacy matters! A series of challenges for chatbots in data-sensitive businesses such as healthcare and finance by Christophe Willemsen
Meetup: Integration of Chatbots in Healthcare and BFSI, Dubai, 1.11.2018
Vlasta Kus talked about the advantages of graph-based natural language processing (NLP), using a public NASA dataset as an example. From his abstract: “[…] we are building a platform (from large part open-source) that integrates Neo4j and NLP (such as Named Entity Recognition, sentiment analysis, word embeddings, LDA topic extraction), and we test and develop further related features and tools, lately, for example, integrating Neo4j and Tensorflow for employing deep learning techniques (such as deep auto-encoders for automatic text summarisation).”