Question

I'm looking to write a reporting tool. The data resides in a ~6 GB PostgreSQL database. The application is an online store/catalog that has items and orders. The stakeholders are requesting a feature that will let them search for an item and get a count of all orders for it over the last 2 years.

Some rows contain quantities and units of measure, so computing a total requires multiplying quantity by the UoM factor for each row.

It's also possible that other reporting functions will be necessary in the future.

I have not delved much into the data analysis aspect of programming. I enjoy Clojure, so I would be thrilled to find a solution that uses Clojure, but only if Clojure offers competitive tools for my needs.

Here are some options I'm considering:

  • merely SQL
  • Clojure
    • core.reducers
    • a clojure hadoop library
  • Hadoop
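
For scale, the "merely SQL" option can often express the whole report as a single query that the database executes close to the data. A sketch, assuming hypothetical table and column names (`items`, `order_lines`, `quantity`, `uom_factor`, `ordered_at`):

```sql
-- Total units ordered for a searched item over the last two years.
-- Table and column names are assumptions; adjust to your schema.
SELECT SUM(ol.quantity * ol.uom_factor) AS total_units
  FROM order_lines ol
  JOIN items i ON i.id = ol.item_id
 WHERE i.name ILIKE '%search term%'
   AND ol.ordered_at > now() - interval '2 years';
```

With an index on `ordered_at` (and on whatever column is searched), a query like this over a ~6 GB database is typically well within interactive response times.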

Can anyone shed some insight into these kinds of problems for me? Are there articles that you would recommend?


Solution

Hadoop is likely overkill for this project. Simply using clojure-jdbc or Korma to read the data from the database and filter/reduce it in Clojure should be fine. At work we routinely work with sequences of that size, though it depends on the expected response time: if near-instantaneous responses are required, you may need some preprocessing and caching.
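
A minimal sketch of that approach using `clojure.java.jdbc`, letting the database do the filtering and reducing the (much smaller) result set in Clojure. The connection details and the table/column names (`order_lines`, `quantity`, `uom_factor`, `ordered_at`) are assumptions:

```clojure
(ns report.core
  (:require [clojure.java.jdbc :as jdbc]))

;; Connection details are assumptions -- adjust for your environment.
(def db-spec {:dbtype   "postgresql"
              :dbname   "store"
              :user     "report"
              :password "secret"})

(defn order-total-for-item
  "Total units ordered for item-id over the last two years,
  multiplying each row's quantity by its unit-of-measure factor.
  Table and column names are hypothetical."
  [item-id]
  (->> (jdbc/query db-spec
                   ["SELECT quantity, uom_factor
                       FROM order_lines
                      WHERE item_id = ?
                        AND ordered_at > now() - interval '2 years'"
                    item-id])
       (transduce (map #(* (:quantity %) (:uom_factor %))) + 0)))
```

Pushing the date and item filters into the query keeps the data transferred to the JVM small; the per-row quantity × UoM multiplication and the sum then run as an ordinary transducer over plain maps, which is easy to extend when other reporting functions come along.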

License: CC-BY-SA with attribution
Not affiliated with StackOverflow