
Releases: neo4j/neo4j-spark-connector

Neo4j Connector for Apache Spark 4.1.1

11 Apr 12:43


Hello Community!
We are pleased to announce version 4.1.1 of the Spark Connector, which fixes several bugs.

There are three files:

  • neo4j-connector-apache-spark_2.11-4.1.1_for_spark_2.4.jar, md5 2accb30aa039ddabc298bd7c8206e234
  • neo4j-connector-apache-spark_2.12-4.1.1_for_spark_2.4.jar, md5 9dd2022da450fa14a4da90bda1cbdbe2
  • neo4j-connector-apache-spark_2.12-4.1.1_for_spark_3.jar, md5 6bfe16d65f57e1469dabf47d963a62e3
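After downloading an artifact, you may want to verify it against the md5 checksums listed above. A minimal sketch in Python (the helper names and local file paths are illustrative, not part of the release):

```python
import hashlib

# Expected md5 checksums, copied from the 4.1.1 release notes above.
EXPECTED_MD5 = {
    "neo4j-connector-apache-spark_2.11-4.1.1_for_spark_2.4.jar": "2accb30aa039ddabc298bd7c8206e234",
    "neo4j-connector-apache-spark_2.12-4.1.1_for_spark_2.4.jar": "9dd2022da450fa14a4da90bda1cbdbe2",
    "neo4j-connector-apache-spark_2.12-4.1.1_for_spark_3.jar": "6bfe16d65f57e1469dabf47d963a62e3",
}

def md5_of_file(path):
    """Compute the md5 hex digest of a file, reading in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(path, expected):
    """Return True when the file's md5 matches the expected digest."""
    return md5_of_file(path) == expected
```

Usage would be `verify("neo4j-connector-apache-spark_2.12-4.1.1_for_spark_3.jar", EXPECTED_MD5[...])` against a downloaded JAR.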

What's Changed

New Contributors

Full Changelog: 4.1.0...4.1.1

Neo4j Connector for Apache Spark 4.1.0

29 Oct 17:34
ed1ea0c


Hello Community!
We are pleased to announce version 4.1.0 of the Spark Connector, which fixes several bugs and introduces official support for Spark's Structured Streaming APIs with both Source and Sink!

This release officially enables the Spark Structured Streaming API and lets you connect Neo4j (via Spark) to systems such as Kinesis, Kafka, Google Dataproc, Azure Service Bus, and any other system that supports Spark Structured Streaming.

There are three files:

  • neo4j-connector-apache-spark_2.11-4.1.0_for_spark_2.4.jar, md5 2cde396181f79950a541ca2e6e218909
  • neo4j-connector-apache-spark_2.12-4.1.0_for_spark_2.4.jar, md5 982292aeb829d06fc6571cb21e850dfd
  • neo4j-connector-apache-spark_2.12-4.1.0_for_spark_3.jar, md5 76016cdd45cb0ca929172cb723dfa3b6

List of resolved issues

  • Fixes #343: add Streaming docs (#346)
  • fixes #386: Streaming Source dups record in Databricks env (#387)
  • fix python failing tests (#385)
  • fixes #370: Sort Partitions by internal ID (#382)
  • fixes #348: repartition(1) - Makes loading very slow (#356)
  • Fixes #369: add parameters to filters (#377)
  • fix to clean up python in an example program (#380)
  • Bump driver version to 4.3.2 (#379)
  • fixes #262: spark 3 source (#374)
  • fixes #262: part 3/4: Spark 2.4 Source (#371)
  • Fixes #358: Ensure the same timezone is used across JVM and Neo4j (#359)
  • Fixes #364: add tests for Neo4j 4.3 (#365)
  • Fixes #361: Change constraint naming convention (#363)
  • fixes #357: transformed isValidQuery severity from ERROR to DEBUG (#360)
  • added javadoc generation
  • Update quickstart.adoc

Pre Release of Neo4j Connector for Apache Spark 4.0.3

16 Jul 14:52
bba4179


Hello Community!
We are pleased to announce the 4.0.3 pre-release (not meant for production use, with no enterprise support guaranteed) of the Spark Connector, which fixes several bugs and introduces preview support for Spark's Structured Streaming with Source and Sink!

This pre-release enables the Spark Structured Streaming API and lets you connect Neo4j (via Spark) to systems such as Kinesis, Kafka, Google Dataproc, Azure Service Bus, and any other system that supports Spark Structured Streaming.

There are three files:

  • neo4j-connector-apache-spark_2.11-4.0.2_for_spark_2.4.jar for Spark 2.4 and Scala 2.11 with md5 3d56400faf10d6c0a3544cb69ed0056a
  • neo4j-connector-apache-spark_2.12-4.0.2_for_spark_2.4.jar for Spark 2.4 and Scala 2.12 with md5 67b4d9a9fcf3d4b484e8d4a67fc33dea
  • neo4j-connector-apache-spark_2.12-4.0.2_for_spark_3.jar for Spark 3.x and Scala 2.12 with md5 aee927b3f1eedccfbed434c61fe2a839

Documentation for the Structured Streaming support has not yet been published, but there is a PR where we are collecting feedback; please take a look and add your ideas on how to improve it!

Please file an issue if you have a problem!

The full list of changes follows:

Bug Fixes

  • fixes #315: wrong recursive call in type conversion (#321)
  • fixes #357: transformed isValidQuery severity from ERROR to DEBUG (#360)
  • fixes #348: repartition(1) - Makes loading very slow (#356)

Enhancements

  • fixes #333: Remove Transaction code LockClientStopped (#336)
  • test improvements (#319)
  • #262 part 1/4: Spark 24 Sink (#322)
  • Fixes #349: Create issue templates (#352)
  • Fixes #262: add Structured Streaming write support for Spark 3 (#344)
  • fixes #354: Change org of package to org.neo4j for maven central publishing
  • update docs for maven central release (#355)
  • Fixes #361: Change constraint naming convention (#363)
  • Fixes #358: Ensure the same timezone is used across JVM and Neo4j (#359)
  • fixes #262: part 3/4: Spark 2.4 Source (#371)
  • fixes #262: spark 3 source (#374)
  • Bump driver version to 4.3.2 (#379)
  • Fixes #369: add parameters to filters (#377)

Other/Infrastructure Enhancements

  • fix to clean up python in an example program (#380)
  • Only run docs workflow when something changed in doc folder (#367)
  • Disable lint:links task for now (#368)

Neo4j Connector for Apache Spark 4.0.2

19 May 15:45


Hello Community!
We are pleased to announce the 4.0.2 release of the Spark Connector, which fixes several bugs!

There are three files:

  • neo4j-connector-apache-spark_2.11-4.0.2_for_spark_2.4.jar for Spark 2.4 and Scala 2.11 with md5 3d56400faf10d6c0a3544cb69ed0056a
  • neo4j-connector-apache-spark_2.12-4.0.2_for_spark_2.4.jar for Spark 2.4 and Scala 2.12 with md5 67b4d9a9fcf3d4b484e8d4a67fc33dea
  • neo4j-connector-apache-spark_2.12-4.0.2_for_spark_3.jar for Spark 3.x and Scala 2.12 with md5 aee927b3f1eedccfbed434c61fe2a839

Please file an issue if you have a problem!

For any question please refer to the Neo4j Community Forum

The full list of changes follows:

Bug Fixes

  • Fixes #350: added ‘six’ python dependency (#351)
  • fixes #323: Node keys are ignored when the field is a map (#330)
  • Fixes #337: Implemented user defined schema on Spark 2.4 and Spark 3
  • fixes #333: Remove Transaction code LockClientStopped (#336)
  • fixes #326: Schema retrieving by APOC doesn’t seem to work when node maps is true
  • fixes #315: wrong recursive call in type conversion (#321)

Enhancements

  • implemented BasePartitioReader in Spark 2.4 (#335)
  • Update java driver to 4.2.5 (#332)

Other/Infrastructure Enhancements

  • Fixed version typo on README.md
  • Update lint-links.js (#340)

Neo4j Connector for Apache Spark 4.0.1

06 Apr 13:36
86d2322


Hello Community!
We are pleased to announce the 4.0.1 release of the Spark Connector; we fixed several bugs, and this release brings the first iteration of Spark 3.x support!

There are three files:

  • neo4j-connector-apache-spark_2.11-4.0.1_for_spark_2.4.jar for Spark 2.4 and Scala 2.11
  • neo4j-connector-apache-spark_2.12-4.0.1_for_spark_2.4.jar for Spark 2.4 and Scala 2.12
  • neo4j-connector-apache-spark_2.12-4.0.1_for_spark_3.jar for Spark 3.x and Scala 2.12
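Because the release ships one JAR per (Scala version, Spark series) combination, choosing the right artifact can be scripted. A minimal sketch, where the helper name and lookup table are my own (the JAR names come from the list above):

```python
# (scala_version, spark_series) -> JAR name, taken from the 4.0.1 release assets above.
JARS_4_0_1 = {
    ("2.11", "2.4"): "neo4j-connector-apache-spark_2.11-4.0.1_for_spark_2.4.jar",
    ("2.12", "2.4"): "neo4j-connector-apache-spark_2.12-4.0.1_for_spark_2.4.jar",
    ("2.12", "3"): "neo4j-connector-apache-spark_2.12-4.0.1_for_spark_3.jar",
}

def pick_jar(scala_version, spark_version):
    """Return the matching JAR name, or raise if no artifact fits the combination."""
    # Spark 3.x artifacts are versioned simply as "3"; everything else targets 2.4.
    spark_series = "3" if spark_version.startswith("3") else "2.4"
    try:
        return JARS_4_0_1[(scala_version, spark_series)]
    except KeyError:
        raise ValueError(
            f"No 4.0.1 artifact for Scala {scala_version} / Spark {spark_version}"
        )
```

For example, a Spark 3.1 cluster built with Scala 2.12 would use `pick_jar("2.12", "3.1.2")`; Scala 2.11 with Spark 3 raises, since no such artifact exists.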

Please file an issue if you have a problem!

For any question please refer to the Neo4j Community Forum

The full list of changes follows:

  • added spark packages zip generation script (#249)
  • Taskless: update README with correct package name (#248)
  • fixes #193: Add testing against Neo4j 4.2 (#251)
  • updated 4.2 profile to official docker image (#258)
  • fixes #256: Improve quoting function (#257)
  • fixes #254: Returning a function from a query breaks in 3.5 (#255)
  • Fixes #261: Incorrect pushdown query for relationship (#268)
  • Fixes #265: handle conversion to Neo4j map (#267)
  • Update quick-java-example.adoc (#270)
  • Update quick-java-example.adoc (#275)
  • Update aura.adoc (#287)
  • #206: Split current code into common and spark-2.4 modules (#282)
  • #206 Add Spark 3.0 support (#283)
  • fixes #286: improved relationship doc (#288)
  • Fixes #266: added test to ensure arrives to the Neo4j Driver (#290)
  • Fix and check links with hyperlink (#271)
  • Fix italic formatting of database option default (#294)
  • Fixes #281: Add Aura integration tests (#293)
  • Fixes #291: remove build warnings (#292)
  • fixes #260: Move CI system to GitHub Actions (#301)
  • Fixes #213: Add Python tests (#296)
  • Fixes #299: Add docs for installing the connector (#305)
  • Fixes #302: Test for Spark 3.1 (#304)
  • Fixes #284: leverage Spark Config as default configuration values (#306)
  • Add details on using Databricks with Aura (#303)
  • fixes #259: Neo4j 4.2.1 drivers (#307)
  • Fixes #285: Add version check for Spark (#308)
  • Taskless - Bump version (#311)
  • Fix JAR naming across docs and README (#312)
  • Update python-ci.yml (#317)

[PRE-RELEASE][4.0.1-pre1] Neo4j Connector for Apache Spark

22 Feb 09:26
0f58d2c


Hello Community!
We are pleased to announce the 4.0.1-pre1 release of the Spark Connector; we fixed several bugs, and this release brings the first iteration of Spark 3.0 support!

Please note that this release is not meant to be used in a production environment.

There are three files:

  • neo4j-connector-apache-spark_2.11_2.4-4.0.1-pre1.jar for Spark 2.4 and Scala 2.11
  • neo4j-connector-apache-spark_2.12_2.4-4.0.1-pre1.jar for Spark 2.4 and Scala 2.12
  • neo4j-connector-apache-spark_2.12_3.0-4.0.1-pre1.jar for Spark 3.0 and Scala 2.12

Please test it and file an issue if you have a problem!

See you soon with the official release!

Neo4j Connector for Apache Spark

10 Nov 15:07
f542df2
  • First major release of the Neo4j Connector for Apache Spark
  • Based around the DataSource API, with polyglot support
  • Supports Spark versions in the range 2.4.5 <= Spark < 3
  • Supports Neo4j 3.5, 4.0, and 4.1
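The supported Spark range stated above can be checked programmatically before deploying. A small sketch under the assumption of plain dotted version strings (the helper names are illustrative):

```python
def parse_version(v):
    """Turn a dotted version string like '2.4.5' into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def spark_supported(version):
    """Check the release's stated range: 2.4.5 <= Spark < 3."""
    # Python compares tuples element-wise, so (2, 4, 5) <= (2, 4, 8) < (3,) works
    # as expected; (3, 0, 0) sorts after (3,) and is therefore rejected.
    return parse_version("2.4.5") <= parse_version(version) < parse_version("3")
```

For instance, Spark 2.4.8 passes the check while 3.0.0 and 2.4.1 do not.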

Second pre release of the Neo4j Connector for Apache Spark

16 Oct 08:20
2794c54


Hello community!
We're happy to share the second pre-release of the Neo4j Connector for Apache Spark!

The connector supports Apache Spark starting from version 2.4.1.

You can find the documentation at the following link.

If you want to provide us feedback, please raise an issue.

Attached you will find two files:

  • neo4j-spark-connector-2.11-4.0.0-pre2.jar which is for Scala 2.11
  • neo4j-spark-connector-2.12-4.0.0-pre2.jar which is for Scala 2.12

First pre release of the Neo4j Connector for Apache Spark

30 Sep 11:35
18372f0

Hello community!
We're happy to share the first pre-release of the Neo4j Connector for Apache Spark.

The connector supports Apache Spark starting from version 2.4.1.

You can find the documentation at the following link.

If you want to provide us feedback, please raise an issue.

Attached you will find two files:

  • neo4j-spark-connector-2.11-4.0.0-pre1.jar which is for Scala 2.11
  • neo4j-spark-connector-2.12-4.0.0-pre1.jar which is for Scala 2.12

Release 2.4.5-M2 for Scala 2.12

18 Aug 12:16

Thanks a lot to @conker84 @nielsdejong @utnaf for your contributions.

Contains the following fixes and improvements.

  • Updated saveGraph.updateNodes to use numPartitions instead of batch size.
  • Added correct numPartitions calculation for edges
  • fixes #133: Neo4jDataFrame#mergeEdgeList should skip node creation (#137)
  • fixes #138: Support for Scala 2.12.x (#146)
  • fixes #132: Supporting temporal data types using Neo4jDataFrame.mergeEdgeList (#136)