How to read CSV file and insert data into PostgreSQL using Mule ESB, Mule Studio

StackOverflow https://stackoverflow.com/questions/13911296

  •  10-12-2021
Question

I am very new to Mule Studio.

I am facing a problem. I have a requirement where I need to insert data from a CSV file into a PostgreSQL database using Mule Studio.

I am using Mule Studio CE (version 1.3.1). I checked on Google and found that DataMapper could be used for this, but it is available only in the EE, so I cannot use it.

I also searched the net and found the article Using Mule Studio to read Data from PostgreSQL(Inbound) and write it to File (Outbound) - Step by Step approach.

That approach seems feasible, but my requirement is the opposite of the article's: I need a File inbound endpoint and a Database outbound component.

What is the way to do so?

Any step by step help (like what components to use) and guidance will be greatly appreciated.

The solution

Here is an example that inserts a two-column CSV file:

<configuration>
    <expression-language autoResolveVariables="true">
        <import class="org.mule.util.StringUtils" />
        <import class="org.mule.util.ArrayUtils" />
    </expression-language>
</configuration>

<spring:beans>
    <spring:bean id="jdbcDataSource" class=" ... your data source ... " />
</spring:beans>

<jdbc:connector name="jdbcConnector" dataSource-ref="jdbcDataSource">
    <jdbc:query key="insertRow"
        value="insert into my_table(col1, col2) values(#[message.payload[0]],#[message.payload[1]])" />
</jdbc:connector>

<flow name="csvFileToDatabase">
    <file:inbound-endpoint path="/tmp/mule/inbox"
        pollingFrequency="5000" moveToDirectory="/tmp/mule/processed">
         <file:filename-wildcard-filter pattern="*.csv" />
    </file:inbound-endpoint>

    <!-- Loads the whole file into RAM - won't work for big files! -->
    <file:file-to-string-transformer />
    <!-- Split each row, dropping the first one (header) -->
    <splitter
        expression="#[rows=StringUtils.split(message.payload, '\n\r');ArrayUtils.subarray(rows,1,rows.size())]" />
    <!-- Transform CSV row in array -->
    <expression-transformer expression="#[StringUtils.split(message.payload, ',')]" />
    <jdbc:outbound-endpoint queryKey="insertRow" />
</flow>
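The `jdbcDataSource` bean above is deliberately left as a placeholder. As one illustration only (the pool class, URL, database name, and credentials below are assumptions, not taken from the question), a Commons DBCP pool pointed at PostgreSQL could look like this:

<spring:beans>
    <!-- Hypothetical data source: assumes commons-dbcp and the PostgreSQL
         JDBC driver are on the classpath; mydb/postgres/secret are placeholders -->
    <spring:bean id="jdbcDataSource" class="org.apache.commons.dbcp.BasicDataSource">
        <spring:property name="driverClassName" value="org.postgresql.Driver" />
        <spring:property name="url" value="jdbc:postgresql://localhost:5432/mydb" />
        <spring:property name="username" value="postgres" />
        <spring:property name="password" value="secret" />
    </spring:bean>
</spring:beans>

Any DataSource implementation works here; a pooled one is preferable when the flow inserts many rows.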

Other tips

To read a CSV file and insert its data into PostgreSQL using Mule, follow the steps below. You need the following prerequisites:

  • PostgreSQL
  • PostgreSQL JDBC driver
  • Anypoint Studio IDE and
  • A database to be created in PostgreSQL

Then configure the PostgreSQL JDBC driver in Global Element Properties inside Studio, and create a Mule flow in Anypoint Studio as follows:

  • Step 1: Wrap CSV file source in File component
  • Step 2: Convert between object arrays and strings
  • Step 3: Split each row
  • Step 4: Transform CSV row in array
  • Step 5: Dump into the destination Database
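The five steps above can be sketched as a Mule 3 flow (a minimal sketch: the paths, the `insertRow` query key, and the `StringUtils`/`ArrayUtils` imports from an `<expression-language>` configuration are assumptions, not part of this answer):

<flow name="csvToPostgres">
    <!-- Step 1: pick up the CSV file from a directory -->
    <file:inbound-endpoint path="/tmp/mule/inbox"
        pollingFrequency="5000" moveToDirectory="/tmp/mule/processed">
        <file:filename-wildcard-filter pattern="*.csv" />
    </file:inbound-endpoint>
    <!-- Step 2: convert the file payload to a string -->
    <file:file-to-string-transformer />
    <!-- Step 3: split into rows, dropping the header -->
    <splitter
        expression="#[rows=StringUtils.split(message.payload, '\n\r');ArrayUtils.subarray(rows,1,rows.size())]" />
    <!-- Step 4: transform each CSV row into an array of values -->
    <expression-transformer expression="#[StringUtils.split(message.payload, ',')]" />
    <!-- Step 5: dump into the destination database -->
    <jdbc:outbound-endpoint queryKey="insertRow" />
</flow>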

I would suggest DataWeave.

Steps

  1. Read the file using an FTP connector/endpoint.

  2. Transform the payload using DataWeave.

  3. Use the database connector to store the data in the DB.
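A Mule 4 style sketch of those three steps (the connector config names, directory, table, and column names are all assumptions for illustration, not taken from the question):

<flow name="ftpCsvToDb">
    <!-- Step 1: poll an FTP directory for new files -->
    <ftp:listener config-ref="FTP_Config" directory="/inbox">
        <scheduling-strategy>
            <fixed-frequency frequency="5000" />
        </scheduling-strategy>
    </ftp:listener>
    <!-- Step 2: parse the CSV and map each row with DataWeave -->
    <ee:transform>
        <ee:message>
            <ee:set-payload><![CDATA[%dw 2.0
output application/java
---
read(payload, "application/csv") map { col1: $.col1, col2: $.col2 }]]></ee:set-payload>
        </ee:message>
    </ee:transform>
    <!-- Step 3: insert each row into the database -->
    <foreach>
        <db:insert config-ref="Database_Config">
            <db:sql>insert into my_table(col1, col2) values(:col1, :col2)</db:sql>
            <db:input-parameters><![CDATA[#[payload]]]></db:input-parameters>
        </db:insert>
    </foreach>
</flow>

DataWeave also removes the need for the manual string splitting shown in the Mule 3 example, since `application/csv` parsing handles rows and headers for you.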

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow