
Michael Hunger has been passionate about software development for a long time. He is particularly interested in the people who develop software, software craftsmanship, programming languages, and improving code. For the last few years he has been working with Neo Technology on the Neo4j graph database. As the project lead of Spring Data Neo4j he helped develop it into a convenient and complete solution for object-graph mapping. He also takes care of Neo4j cloud hosting efforts, looks after the Neo4j community in all regards, and is involved with activities in all parts of the company.

Neo4j & Cypher: Cleaning Out Your Graph

03.29.2014

If you want to delete lots of data from a Neo4j database with Cypher, here are your options:

Just stop the server, delete the directory, and start again

This is the fastest way and leaves nothing behind: just delete db/data/graph.db and you’re done.
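
On a plain tarball install that boils down to something like the following (a minimal sketch; $NEO4J_HOME and the exact store path are assumptions that depend on your install layout):

# Minimal sketch, assuming $NEO4J_HOME points at a default 2.x tarball install
# where the store directory is data/graph.db
"$NEO4J_HOME/bin/neo4j" stop
rm -rf "$NEO4J_HOME/data/graph.db"
"$NEO4J_HOME/bin/neo4j" start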

Cypher Statement before 2.1

An “Unknown Error” or an OutOfMemoryException is a symptom that your transaction has grown too large and is consuming too much memory.

This is unrelated to your configuration; you just have to keep the transaction size in check.

If you want to delete elements in batches, use something like this:

MATCH (a)
WITH a LIMIT 10000
OPTIONAL MATCH (a)-[r]-()
DELETE a, r
RETURN COUNT(*)

Run this repeatedly until the result stays at 0. The query will find at most 10000 nodes, then find all of their relationships, and then delete both.
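
If you are scripting this, a small shell loop can rerun the statement until nothing is left. This is a minimal sketch, assuming the neo4j-shell tool that ships with the 2.x server and that its result table prints the count in a "| 0 ... |" row; adjust the check to whatever client you actually use:

# Rerun the batched delete until the returned count stays 0.
# Assumes $NEO4J_HOME/bin/neo4j-shell exists (2.x tarball install).
QUERY='MATCH (a) WITH a LIMIT 10000 OPTIONAL MATCH (a)-[r]-() DELETE a, r RETURN COUNT(*);'
while true; do
  result=$("$NEO4J_HOME/bin/neo4j-shell" -c "$QUERY")
  # stop once the statement reports that nothing was matched and deleted
  echo "$result" | grep -q '^| 0 ' && break
done

But how would you do it in Neo4j 2.1?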

Cypher 2.1 and beyond

You can make use of the new PERIODIC COMMIT feature that takes care of batching behind the scenes. Try this:

USING PERIODIC COMMIT
MATCH (a)
OPTIONAL MATCH (a)-[r]-()
DELETE a, r;


Published at DZone with permission of Michael Hunger, author and DZone MVB.
