
Releases: neo4j-contrib/neo4j-streams

Release 3.4.2 of the Neo4j Kafka integration

06 Jun 12:57

We're very happy to present the new 3.4.2 release of the Neo4j Kafka integration.

Our focus since the last release was on making our integration easier to use and fixing some issues.

We introduced several new features in the Sink:

Fixes #99: Provide a roundtrip-sink-config lets you ingest data that comes from another Neo4j instance as CDC events.

In a similar way, fixes #154: provide a common pattern for ingestion lets you define simple expressions to extract data from any nested event structure and transform that data into Nodes and/or Relationships.

For example, to create users and their purchases from the users and orders topics:

streams.sink.topic.pattern.node.users=User{!userId}
streams.sink.topic.pattern.relationship.orders=User{!userId} BOUGHT{purchase.price, purchase.currency} Product{!productId}
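
For instance, a message like the following on the orders topic (a purely hypothetical payload, shown only to illustrate the pattern) would produce a User node, a Product node, and a BOUGHT relationship carrying the price and currency properties:

{"userId": 42, "productId": 7, "purchase": {"price": 19.99, "currency": "EUR"}}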

Fixes #102: Manual commit behavior for handling errors and retrievals lets you use a manually committing consumer, and it also improves the streams.consume procedure, allowing you to read data starting from a specific partition/offset.
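
For example, to poll five seconds' worth of events from partition 0 of the orders topic starting at offset 30 (a sketch; the topic and numbers are made up, and the partitions config key reflects our reading of the procedure docs):

CALL streams.consume('orders', {timeout: 5000, partitions: [{partition: 0, offset: 30}]}) YIELD event
RETURN event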

Breaking changes

There is a small change to the Sink management. With fixes #160: change the streams.sink.enabled to false, the default value of the streams.sink.enabled property is now false, so you need to explicitly set it to true; if you only specify the topic mapping, you'll see a WARN message in neo4j.log.
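
So a minimal Sink configuration now looks like this (using the same placeholder convention as the docs):

streams.sink.enabled=true
streams.sink.topic.cypher.<topic-name>=CYPHER-QUERY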

We also fixed several other issues.

Neo4j-Streams Release 3.5.0 and Kafka Connect Plugin Release 1.0.0

23 Jan 13:22

We're excited about the new release. Big thanks to @conker84 from our partner Larus IT for all the hard work of building this integration.

Thanks to your feedback, the Neo4j extension saw a number of fixed issues and more testing in the field. Please continue to try it for different use cases and let us know how well it works for you. Special thanks to @lju-lazarevic and @sarmbruster.

Kafka Connect Plugin Release 1.0.0

The Neo4j Kafka integration is finally available as a Kafka Connect plugin. It provides the sink functionality and will also be available on Confluent Hub.
You can test the plugin locally with the Docker Compose setup we provide.

For more details see the docs
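
As a rough sketch of what a sink connector configuration looks like (the property names follow our reading of the plugin docs; the topic, URI, and credentials are placeholders), you would POST something like this to the Kafka Connect REST API:

{
  "name": "Neo4jSinkConnector",
  "config": {
    "topics": "my-topic",
    "connector.class": "streams.kafka.connect.sink.Neo4jSinkConnector",
    "neo4j.server.uri": "bolt://neo4j:7687",
    "neo4j.authentication.basic.username": "neo4j",
    "neo4j.authentication.basic.password": "password",
    "neo4j.topic.cypher.my-topic": "MERGE (p:Person {name: event.name})"
  }
}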

New Procedure

We added a new procedure to receive events from a topic and use them in your Cypher statements. That's useful both for testing and for consuming events directly as part of another workflow.

We describe how to use it in the procedure documentation.
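
A quick sketch of a call (the topic name is hypothetical; timeout is the polling timeout in milliseconds):

CALL streams.consume('my-topic', {timeout: 5000}) YIELD event
RETURN event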

Batching

To allow better control of batching, we added a configuration parameter, batch.size, which together with the Kafka setting max.poll.records lets you consume events in batches.
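
A sketch of the two settings side by side (the exact property prefixes are an assumption on our part; check the docs for your deployment):

# standard Kafka consumer setting, prefixed with kafka. like the other broker settings
kafka.max.poll.records=500
# new parameter introduced in this release
batch.size=500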

Bugfixes & Enhancements

Neo4j-Streams Release 3.4.1 and Kafka Connect Plugin Release 1.0.0

23 Jan 13:02

The release notes for this version are identical to those of release 3.5.0 above.

First Release Neo4j Kafka Integration Plugin

27 Nov 13:22

In the past year we got a lot of questions about integrating Neo4j with Apache Kafka and other streaming data solutions. So a few weeks ago, with the help of our Italian partner LARUS (especially @conker84) and our colleague @sarmbruster, we started to work on a first integration.

Today we want to make this available in a first release under the Apache License for you to try out and test. It works with Neo4j 3.4.x onwards and Kafka 0.10.x onwards.

It offers three capabilities:

  1. a procedure to quickly send data to Kafka (see the example right after this list)
  2. a Neo4j producer as a change data capture (CDC) source to transmit change events to downstream systems
  3. a Neo4j consumer to turn Kafka messages into graph structures using templated Cypher statements
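
As a quick taste of the procedure from capability 1 (topic name and payload are made up):

CALL streams.publish('my-topic', 'Hello from Neo4j')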

Just grab the attached JAR, drop it into your $NEO4J_HOME/plugins directory, and add a config like:

kafka.zookeeper.connect=localhost:2181
kafka.bootstrap.servers=localhost:9092
# and
streams.procedures.enabled=true
# or
streams.source.enabled=true
streams.source.topic.nodes.<topic-name>=PATTERN
# or
streams.sink.enabled=true
streams.sink.topic.cypher.<topic-name>=CYPHER-QUERY
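
For a concrete sink mapping (the topic name and properties are hypothetical), each incoming message is bound to the event variable inside the templated Cypher statement:

streams.sink.topic.cypher.my-topic=MERGE (p:Person {id: event.id}) SET p.name = event.name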

You can find the details in the documentation: https://neo4j-contrib.github.io/neo4j-streams

We also published a more detailed blog post on Medium about it.

We would love you to test it out and give us feedback, either here as GitHub issues or by answering our short survey.