I am analyzing a MongoDB database in Python via Pymongo. I want to parallelize the documents download + processing steps (processing is long, and I do not have enough RAM or disk space to download the entire database at once). But apparently parallel computing with Pymongo / MongoDB causes trouble on Unix systems. And for that I wanted the answer to the original question.

---

I realize that this is quite an old question, and that mongodump/mongorestore is clearly the right way if you want a 100% faithful result, including indexes. However, I needed a quick and dirty solution that would likely be forwards and backwards compatible between old and new versions of MongoDB, provided there's nothing especially wacky going on. There are other acceptable solutions above, but this Unix pipeline is relatively short and sweet:

```shell
mongo --quiet mydatabase --eval "db.getCollectionNames().join('\n')" | \
```

---

If you want to use mongoexport and mongoimport to export/import each collection from a database, I think this utility can be helpful for you. The MongoDB shell version is listed at the top; the help list is quite long, so I won't output it all here.

```shell
# List collection names, then strip the JSON array punctuation so the
# result can be word-split by the shell.
DATABASE_COLLECTIONS=$(mongo $CONN $ARGS --quiet --eval 'db.getCollectionNames()' | tr -d '[]"' | sed 's/,/ /g')

for collection in $DATABASE_COLLECTIONS; do
  mongoexport --host $HOST -u $USERNAME -p $PASSWORD --db $DB -c "$collection" --jsonArray -o "$collection"
done
```

The matching call on the import side is:

```shell
mongoimport $ARGS -d $DB -c $collection $path
```
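The fiddly part of the loop above is turning the JSON-ish array that `db.getCollectionNames()` prints into plain shell words. A minimal sketch of just that parsing step, with a hard-coded sample string standing in for real shell output (the names `users`, `orders`, `logs` are made up for illustration):

```shell
# Convert '[ "a", "b" ]' as printed by the mongo shell into 'a b'.
parse_collections() {
  tr -d '[]" ' | tr ',' ' '
}

# Stand-in for: mongo $CONN --quiet --eval 'db.getCollectionNames()'
sample='[ "users", "orders", "logs" ]'
for collection in $(echo "$sample" | parse_collections); do
  echo "would export: $collection"
done
```

On a live server you would pipe the real `mongo --quiet --eval` output through `parse_collections` instead of the sample string.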
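Since each per-collection export is independent, the downloads the questioner wants to parallelize can be fanned out from the shell with `xargs -P`, avoiding the Pymongo forking issues entirely. This is a sketch with assumed names: hard-coded collection names stand in for the `mongo --quiet --eval` pipeline, and an `echo` into a file stands in for the real `mongoexport -d "$DB" -c {} --jsonArray -o {}.json` call:

```shell
# Fan out up to 4 stand-in "exports" at once, one per collection name.
outdir=$(mktemp -d)
printf 'users\norders\nlogs\n' |
  xargs -P 4 -I {} sh -c "echo 'exported {}' > '$outdir/{}.json'"
ls "$outdir"
```

Swapping the `sh -c` stand-in for a real `mongoexport` invocation keeps the same fan-out structure; `-P` caps how many exports run concurrently.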
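The import direction mirrors the export loop: derive each collection name from its exported file and hand it to `mongoimport $ARGS -d $DB -c $collection $path`. A dry-run sketch in which `echo` prints the command instead of running it, and the file names are made up:

```shell
# Create two placeholder export files, then show the mongoimport call
# that would be issued for each one.
dir=$(mktemp -d)
touch "$dir/users.json" "$dir/orders.json"
for path in "$dir"/*.json; do
  collection=$(basename "$path" .json)
  echo "mongoimport -d mydb -c $collection $path"   # stand-in: print, don't run
done
```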