Spring Batch: Step Partitioning OutOfMemoryError
I have a partitioned Spring Batch job that uses a JdbcPagingItemReader to read and process the data in chunks.
The job runs fine with 3000 records, but when the load increases to 6000 we frequently hit an OutOfMemoryError:
Exit FailureExeptions : [java.lang.OutOfMemoryError, org.springframework.batch.core.JobExecutionException: Partition handler returned an unsuccessful step, java.lang.OutOfMemoryError, java.lang.OutOfMemoryError, java.lang.OutOfMemoryError]
The average load for this batch is 3000 records, but in exceptional cases it can reach a maximum of 6000. We are testing the batch with 6000 records.
Current Activity:
1) We are trying to capture a heap dump to analyze the problem further.
2) Looking at other options, such as increasing the heap size. The current settings are 128 MB minimum and 512 MB maximum (see the sketch after this list).
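A minimal sketch of the JVM options for both activities, assuming a HotSpot JVM (the dump path is only an example):

-Xms128m -Xmx512m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/partitionJob-oom.hprof

-XX:+HeapDumpOnOutOfMemoryError writes an .hprof file automatically the next time the error occurs, so the dump captures the actual failure rather than a moment picked by hand; raising -Xmx can then be tested separately.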
The job XML file:
<import resource="../config/batch-context.xml" />
<import resource="../config/database.xml" />
<job id="partitionJob" xmlns="http://www.springframework.org/schema/batch">
<step id="masterStep" parent="abstractPartitionerStagedStep">
<partition step="slave" partitioner="rangePartitioner">
<handler grid-size="5" task-executor="taskExecutor" />
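<!-- grid-size 5 with the 5-thread pool below: all five partitions run concurrently, each with its own step-scoped reader, processor and writer -->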
</partition>
</step>
</job>
<bean id="abstractPartitionerStagedStep" abstract="true">
<property name="listeners">
<list>
<ref bean="updatelistener" />
</list>
</property>
</bean>
<bean id="updatelistener" class="com.test.springbatch.model.UpdateFileCopyStatus" />
<step id="slave" xmlns="http://www.springframework.org/schema/batch">
<tasklet>
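<!-- commit-interval=1: each record is read, processed, written and committed individually; any Exception is skippable, up to 100 skips per partition -->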
<chunk reader="pagingItemReader" writer="flatFileItemWriter"
processor="itemProcessor" commit-interval="1" retry-limit="0"
skip-limit="100">
<skippable-exception-classes>
<include class="java.lang.Exception" />
</skippable-exception-classes>
</chunk>
</tasklet>
</step>
<bean id="rangePartitioner" class="com.test.springbatch.partition.RangePartitioner">
<property name="dataSource" ref="dataSource" />
</bean>
<bean id="taskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="5" />
<property name="maxPoolSize" value="5" />
<property name="queueCapacity" value="100" />
<property name="allowCoreThreadTimeOut" value="true" />
<property name="keepAliveSeconds" value="60" />
</bean>
<bean id="itemProcessor" class="com.test.springbatch.processor.CaseProcessor" scope="step">
<property name="threadName" value="#{stepExecutionContext[name]}" />
</bean>
<bean id="pagingItemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
<property name="dataSource" ref="dataSource" />
<property name="saveState" value="false" />
<property name="queryProvider">
<bean class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="selectClause" value="SELECT *" />
<property name="fromClause" value="FROM ( SELECT CASE_NUM ,CASE_STTS_CD, UPDT_TS,SBMT_OFC_CD, SBMT_OFC_NUM,DSTR_CHNL_CD,APRV_OFC_CD,APRV_OFC_NUM,SBMT_TYP_CD, ROW_NUMBER() OVER(ORDER BY CASE_NUM) AS rownumber FROM TSMCASE WHERE PROC_IND ='N' ) AS data" />
<property name="whereClause" value="WHERE rownumber BETWEEN :fromRow AND :toRow " />
<property name="sortKey" value="CASE_NUM" />
</bean>
</property>
<property name="parameterValues">
<map>
<entry key="fromRow" value="#{stepExecutionContext[fromRow]}" />
<entry key="toRow" value="#{stepExecutionContext[toRow]}" />
</map>
</property>
<property name="pageSize" value="100" />
<property name="rowMapper">
<bean class="com.test.springbatch.model.CaseRowMapper" />
</property>
</bean>
<bean id="flatFileItemWriter" class="com.test.springbatch.writer.FNWriter" scope="step" />
My Questions:
- The job runs fine with 3000 records, so why does it throw an OOM error when the load is 6000 records, even though I am using chunk-oriented processing? Does JdbcPagingItemReader cache anything internally?
- Does my job configuration look OK? Is there any room for improvement in the configuration?