Spring: Generic RowMapper for dynamic queries

I am using Spring Batch to read from Oracle and write into Elasticsearch.

My code works fine for static queries, for example: select emp_id, emp_name from employee_table. I have a RowMapper class that maps the values from the ResultSet to the Employee POJO.

My requirement is:

The query will be input by the user, so it might be any of the following:

  1. select emp_id, emp_name from employee_table
  2. select cust_id, cust_name, cust_age from customer_table
  3. select door_no, street_name, loc_name, city from address_table
  4. Similar queries

My questions are:

  1. Is there a way to dynamically create a POJO according to the query given by the user?
  2. Will the RowMapper concept work if the query keeps changing as in my case?
  3. Is there something like a generic rowmapper?

Sample code would be much appreciated.



Solution 1:[1]

If you have objects you need to map to...

Consider aliasing the columns in your SQL to match your object's field names, and use a custom implementation of RowMapper that extends BeanWrapperFieldSetMapper.

So if your POJO looks like this:

public class Employee {

    private String employeeId;
    private String employeeName;

    ...
    // getters and setters

}

Then your SQL can look like this:

SELECT emp_id employeeId, emp_name employeeName from employee_table

Then your wrapped RowMapper would look something like this:

import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.transform.DefaultFieldSet;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.validation.BindException;

public class BeanWrapperRowMapper<T> extends BeanWrapperFieldSetMapper<T> implements RowMapper<T> {

    @Override
    public T mapRow(final ResultSet rs, final int rowNum) throws SQLException {
        final FieldSet fs = getFieldSet(rs);
        try {
            return super.mapFieldSet(fs);
        } catch (final BindException e) {
            throw new IllegalArgumentException("Could not bind bean to FieldSet", e);
        }
    }

    private FieldSet getFieldSet(final ResultSet rs) throws SQLException {
        final ResultSetMetaData metaData = rs.getMetaData();
        final int columnCount = metaData.getColumnCount();

        final List<String> tokens = new ArrayList<>();
        final List<String> names = new ArrayList<>();

        for (int i = 1; i <= columnCount; i++) {
            tokens.add(rs.getString(i));
            names.add(metaData.getColumnName(i));
        }

        return new DefaultFieldSet(tokens.toArray(new String[0]), names.toArray(new String[0]));    
    }

}
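To put the wrapper to work, the reader wiring could look like the sketch below (assumptions: Java-based configuration and a JdbcCursorItemReader; the bean method name is made up). The aliased SQL feeds the wrapper, which binds each column to the matching Employee field by name:

```java
// Sketch only; names are illustrative. setTargetType is inherited from
// BeanWrapperFieldSetMapper and tells the wrapper which POJO to instantiate.
@Bean
public JdbcCursorItemReader<Employee> employeeReader(DataSource dataSource) {
    BeanWrapperRowMapper<Employee> rowMapper = new BeanWrapperRowMapper<>();
    rowMapper.setTargetType(Employee.class);

    JdbcCursorItemReader<Employee> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dataSource);
    // Column aliases match the Employee field names
    reader.setSql("SELECT emp_id employeeId, emp_name employeeName FROM employee_table");
    reader.setRowMapper(rowMapper);
    return reader;
}
```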

Alternatively...

If you don't have any POJOs to map to, use the out-of-the-box ColumnMapRowMapper to get back a map (Map<String, Object>) of column names (let's call them COL_A, COL_B, COL_C) to values. Then, if your writer is something like a JdbcBatchItemWriter, you can set your named parameters as:

INSERT INTO ${schema}.TARGET_TABLE (COL_1, COL_2, COL_3) values (:COL_A, :COL_B, :COL_C)

and then your ItemSqlParameterSourceProvider implementation could look like so:

import java.util.Map;

import org.springframework.batch.item.database.ItemSqlParameterSourceProvider;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.namedparam.SqlParameterSource;

public class MapItemSqlParameterSourceProvider implements
        ItemSqlParameterSourceProvider<Map<String, Object>> {

    @Override
    public SqlParameterSource createSqlParameterSource(Map<String, Object> item) {
        return new MapSqlParameterSource(item);
    }

}
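Wiring the provider into the writer could then look like this sketch (Java config assumed; the table, column, and bean names are placeholders):

```java
// Sketch only. For each item map, the provider exposes the entries as named
// parameters, so :COL_A/:COL_B/:COL_C are resolved from the map's keys.
@Bean
public JdbcBatchItemWriter<Map<String, Object>> mapWriter(DataSource dataSource) {
    JdbcBatchItemWriter<Map<String, Object>> writer = new JdbcBatchItemWriter<>();
    writer.setDataSource(dataSource);
    writer.setSql("INSERT INTO TARGET_TABLE (COL_1, COL_2, COL_3) VALUES (:COL_A, :COL_B, :COL_C)");
    writer.setItemSqlParameterSourceProvider(new MapItemSqlParameterSourceProvider());
    return writer;
}
```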

Solution 2:[2]

To answer your questions:

  1. Is there a way to dynamically create a POJO based on the user's query? - Even if there were, I'm not sure how much help it would be. For your use case, I'd suggest just using a Map.
  2. Will the RowMapper concept work if the query keeps changing? - If you use a Map, you can use the column names as the keys and the column values as the values. You should be able to create a RowMapper implementation that does this.
  3. Is there something like a generic RowMapper? - There is, but it's intended for POJOs, so you'd need to create your own for this.
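Point 2 can be sketched as a small map-based mapper. This is a sketch, not Spring's API: the class name is made up, and in a real job this logic would be the mapRow body of a class implementing Spring's RowMapper<Map<String, Object>> (the built-in ColumnMapRowMapper, used in Solution 3 below, does essentially this):

```java
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a generic map-based row mapper: one LinkedHashMap per row,
// keyed by column label, preserving the SELECT's column order.
public class MapRowMapperSketch {

    public static Map<String, Object> mapRow(ResultSet rs, int rowNum) throws SQLException {
        ResultSetMetaData metaData = rs.getMetaData();
        int columnCount = metaData.getColumnCount();
        Map<String, Object> row = new LinkedHashMap<>(columnCount);
        for (int i = 1; i <= columnCount; i++) {
            // getColumnLabel honours SQL aliases; getColumnName may not
            row.put(metaData.getColumnLabel(i), rs.getObject(i));
        }
        return row;
    }
}
```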

Solution 3:[3]

I found a solution to my problem by using Spring's ColumnMapRowMapper. Below is a snippet from the XML configuration file. I didn't generate any POJO class; I managed with a Map and inserted it into ES. The map's keys must match the field names in the index.

<step id="slave" xmlns="http://www.springframework.org/schema/batch">
    <tasklet>
        <chunk reader="pagingItemReader" writer="elasticSearcItemWriter"
            processor="itemProcessor" commit-interval="10" />
    </tasklet>
</step>



<bean id="pagingItemReader"
        class="org.springframework.batch.item.database.JdbcPagingItemReader"
        scope="step">
        <property name="dataSource" ref="dataSource" />
        <property name="queryProvider">
            <bean
                class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
                <property name="dataSource" ref="dataSource" />
                <property name="selectClause" value="*******" />
                <property name="fromClause" value="*******" />

                <property name="whereClause" value="*******" />
                <property name="sortKey" value="*******" />
            </bean>

        </property>
        <!-- Inject via the ExecutionContext in rangePartitioner -->
        <property name="parameterValues">
            <map>
                <entry key="fromId" value="#{stepExecutionContext[fromId]}" />
                <entry key="toId" value="#{stepExecutionContext[toId]}" />
            </map>
        </property>
        <property name="pageSize" value="10" />
        <property name="rowMapper">
            <bean class="org.springframework.jdbc.core.ColumnMapRowMapper" />
        </property>
    </bean>

And inside my elasticSearcItemWriter class....

public class ElasticSearchItemWriter<T> extends AbstractItemStreamItemWriter<T>
        implements ResourceAwareItemWriterItemStream<T>, InitializingBean {

    ...

    @Override
    public void write(List<? extends T> items) throws Exception {
        client = jestClient.getJestClient();
        if (!items.isEmpty()) {
            for (Object item : items) {
                @SuppressWarnings("unchecked")
                Map<String, Object> map = (Map<String, Object>) item;
                // Asynchronous index request
                Index index = new Index.Builder(map)
                        .index(Start.prop.getProperty(Constants.ES_INDEX_NAME))
                        .type(Start.prop.getProperty(Constants.ES_INDEX_TYPE))
                        .build();

                client.executeAsync(index, new JestResultHandler<JestResult>() {
                    public void failed(Exception ex) {
                        // handle/log the failure here
                    }

                    public void completed(JestResult result) {
                        // handle/log the success here
                    }
                });
            }
        }
    }

    ...
}

Solution 4:[4]

You can do it simply, as below:

SettingsDto settings = SettingsDao.getById(1, new BeanPropertyRowMapper<>(SettingsDto.class));

In a generic way, you can pass any DTO class, but note that the DTO field names must match the SQL column names, or you must alias the columns in the SQL query to match the DTO.

@Data
public class SettingsDto {

    private int id;

    private int retryCount;

    private int batchSize;

    private int retryPeriod;

    private int statusInitialDelay;

}

My DAO method is below:

SettingsDto getById(int id, final RowMapper<SettingsDto> mapper);

Its implementation is below:

@Override
public SettingsDto getById(final int id, final RowMapper<SettingsDto> mapper) {
    return new JdbcTemplate(yourDataSource).queryForObject(QUERY_SETTINGS_BY_ID, new Object[]{id}, mapper);
}

The SQL is below; as noted, the aliases must match the field names in the DTO:

private static final String QUERY_SETTINGS_BY_ID = "SELECT id, retry_count AS retryCount FROM settings WHERE id = ?";

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1:
Solution 2: Michael Minella
Solution 3: Andy
Solution 4: Rajitha Bhanuka