Question

I am new to MongoDB and want to know how to import a JSON file from one server to another. I tried the command mongoimport -d test -c bik check.json and it works fine for me. Now I want to know: when there are multiple JSON files, how do I import all of them in a single go? I could not find any documentation saying this is not possible. Is it possible, and if so, how?


Solution

You can always write a small shell script.

colls=( mycoll1 mycoll2 mycoll5 )

for c in "${colls[@]}"
do
  # import each collection from a matching <name>.json file
  mongoimport -d mydb -c "$c" --file "$c.json"
done
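With the list above, each iteration expands to a command like the following (a sketch, assuming each collection's data lives in a matching <name>.json file in the current directory):

mongoimport -d mydb -c mycoll1 --file mycoll1.json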

OTHER TIPS

I came up with a more elegant way to automatically import ALL collections:

ls -1 *.json | sed 's/\.json$//' | while read col; do
    mongoimport -d db_name -c "$col" < "$col.json";
done

I hope this is helpful.

Windows batch version (note that --jsonArray expects each file to contain a single JSON array; omit the flag for newline-delimited documents):

@echo off
for %%f in (*.json) do (
    "mongoimport.exe" --jsonArray --db databasename --collection collectionname --file %%f
)

You can also do it this way:

for filename in *.json; do mongoimport --db <Database> --collection <Collection Name> --file "$filename"; done

This worked for me on Mac OS X:

find . -regex '.*/[^/]*\.json' | xargs -L 1 mongoimport --db DB_NAME -u USER_NAME -p PASSWORD --collection COLLECTION_NAME --file

For a Windows .bat file. This works well when you have a folder of JSON files and each collection's name matches its file name.

@echo off
for %%f in (*.json) do (
    "mongoimport.exe" --db databasename --collection %%~nf --drop --file %%f
)
pause

Another one-line solution (assuming you are in the folder where the JSON files are):

ls *.json | sed 's/\.json$//' | xargs -I{} mongoimport -d DATABASE_NAME -c {} {}.json

One-line solution (Windows):

for /F %i in ('dir /b c:\files\*.json') do mongoimport.exe -d db -c files --file c:\files\%i

I'm going to show how to import many collections efficiently using only the Linux terminal (it also works on a Mac).

All the JSON files must be in the same folder, and each file's name should be the name of the collection it will be imported into.

So, let's begin. Open the folder that contains your JSON files, replace <DATABASE> with your database name, then execute the line below:

for collection in $(ls | cut -d'.' -f1); do mongoimport --db <DATABASE> --collection ${collection} --file ${collection}.json; done

But what is going on there?

First of all, keep in mind that the part in parentheses is executed first. In this case, it builds a list of all the files, keeping just the name of each file (removing its extension).

Secondly, that list is fed into a "for" loop through a local variable called collection (this variable's name could be anything you want).

Thirdly, the "do" executes the import line (*).

Finally, "done" finishes the loop.

(*) The import line consists of "mongoimport", which requires the database name ("--db"), the collection name ("--collection"), and the file name ("--file"). The collection and file names are filled in from the $collection variable created by the "for" loop.
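For instance, if the folder contains users.json and orders.json (hypothetical file names, purely for illustration), the command substitution expands to the bare collection names:

> ls | cut -d'.' -f1
orders
users

The loop then runs mongoimport --db <DATABASE> --collection orders --file orders.json, and likewise for users. One caveat: cut -d'.' -f1 keeps only the text before the first dot, so a file like daily.backup.json would map to a collection named daily.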

I hope this helped someone! Good luck, guys :)

I used the solutions here to add a shell function to my bash profile for doing this quickly.

My example depends on the mongo export outputting each collection as a file with the collection name and a .metadata.json extension.

function mimport() { for filename in *; do collection="${filename%.metadata.json}"; mongoimport --db "$1" --collection "$collection" --file "$filename"; done }

Run it in the directory containing the export files, passing the DB name to the command...

mimport my_db

This will load all collections into the DB at localhost.
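If the export needs to land on another server instead of localhost, mongoimport's standard --host and --port options (or a full --uri connection string) can be added inside the loop; a minimal sketch, assuming a hypothetical remote host db.example.com:

function mimport() { for filename in *; do collection="${filename%.metadata.json}"; mongoimport --host db.example.com --port 27017 --db "$1" --collection "$collection" --file "$filename"; done }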

Linux:

> cat one.json two.json > three.json

> mongoimport --db foo --collection baz --file three.json

Or, all files in the folder:

> cat *.json > big.json

> mongoimport --db foo --collection baz --file "big.json"

Note that this only works when each file contains newline-delimited JSON documents; concatenating files that each hold a single JSON array produces invalid input.

Not sure whether it's a new feature, but mongoimport can now read from stdin. So importing multiple JSON files is as simple as:

cat *.json | mongoimport --uri "mongodb://user:password@host/db?option=value" --collection example

I'm using mongodb-tools v4.2.0, by the way.

UPDATE

mongoimport can potentially consume a large amount of memory, which may cause the process to be killed by the system's OOM killer. My machine has 32 GB of RAM, and this happened consistently when I tried to import ~10 GB of data stored on a RAM disk.

To divide a relatively large job into batches:

#!/usr/bin/env bash

declare -a json_files=()
for f in *.json; do
    json_files+=("$f")   # append as an array element, not string concatenation
    if [[ "${#json_files[@]}" -ge 1000 ]]; then
        # import this batch, then reset the list
        cat "${json_files[@]}" | mongoimport --uri="mongodb://user:pass@host/db" --collection=examples -j8 #--mode=upsert --upsertFields=id1
        json_files=()
    fi
done

# flush any remaining files from the last, partial batch
if [[ "${#json_files[@]}" -gt 0 ]]; then
    cat "${json_files[@]}" | mongoimport --uri="mongodb://user:pass@host/db" --collection=examples -j8
fi
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow