Question

I am trying to find the best way to import all of our Lighthouse data (which I exported as JSON) into JIRA, which wants a CSV file.

I have a main folder containing many subdirectories, JSON files and attachments. The total size is around 50MB. JIRA allows importing CSV data, so I was thinking of converting the JSON data to CSV, but all the converters I have seen online will only handle a single file, rather than recursively walking an entire folder structure and producing a CSV equivalent that can then be imported into JIRA.

Does anybody have any experience of doing this, or any recommendations?

Thanks, Jon


Answers

The JIRA CSV importer assumes a denormalized view of each issue, with all the fields available in one line per issue. I think the quickest way would be to write a small Python script to read the JSON and emit a minimal CSV. That should get you issues and comments. Keep track of which Lighthouse ID corresponds to each new JIRA issue key. Then write another script to add things like attachments using the JIRA SOAP API; for JIRA 5.0 the REST API is a better choice.
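
A minimal sketch of such a script, assuming the export keeps each ticket in tickets/<id>/ticket.json and uses field names such as number, title, body and state (check your own export and adjust the paths and keys accordingly):

import csv
import glob
import json

# Hypothetical layout and field names -- adjust to match your actual export.
TICKET_GLOB = "lighthouse_export/tickets/*/ticket.json"

with open("jira_import.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["Lighthouse ID", "Summary", "Description", "Status"])

    for path in sorted(glob.glob(TICKET_GLOB)):
        with open(path, encoding="utf-8") as f:
            data = json.load(f)

        # Some exports wrap the fields in a top-level "ticket" key.
        ticket = data.get("ticket", data)

        writer.writerow([
            ticket.get("number", ""),
            ticket.get("title", ""),
            ticket.get("body", ""),
            ticket.get("state", ""),
        ])

For the attachment pass, JIRA 5.0 and later expose a REST endpoint for adding attachments to an existing issue. Here is a rough sketch using the requests library; the base URL, issue key, credentials and file name are placeholders, and older instances may need the SOAP API instead:

import requests

url = "https://jira.example.com/rest/api/2/issue/PROJ-123/attachments"

with open("screenshot.png", "rb") as f:
    resp = requests.post(
        url,
        auth=("user", "password"),
        # The XSRF check must be disabled for multipart uploads; some
        # versions spell this value "nocheck" instead of "no-check".
        headers={"X-Atlassian-Token": "no-check"},
        files={"file": f},
    )

resp.raise_for_status()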

We just went through a Lighthouse to JIRA migration and ran into this. The best thing to do is have your script start at the top-level export directory and loop through each ticket.json file, building one master CSV (or JSON) file containing all the tickets, which you can then import into JIRA.

In Ruby (which is what we used), it would look something like this:

require "json"

Dir.glob("path/to/lighthouse_export/tickets/*/ticket.json") do |path|
  data = JSON.parse(File.read(path))
  # access the ticket fields in `data` and append a row to the CSV
end