java - Can't configure Spring Batch to run jobs sequentially -


I use the following job launcher to launch Spring Batch jobs:

  <bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository"/>
    <property name="taskExecutor">
      <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor"/>
    </property>
  </bean>

and this job definition:

  <job id="bulkExportJob" restartable="false" xmlns="http://www.springframework.org/schema/batch">
    <description>Exports application PDFs in a bulk operation</description>

    <step id="startExport" next="exportFileTree">
      <description>Do start export</description>
      <tasklet ref="startBulkActionTasklet"/>
    </step>

    <step id="exportFileTree" next="zipFileTree">
      <description>Export application</description>
      <tasklet>
        <chunk reader="bulkActionTargetReader" writer="bulkExportFileTreeWriter" commit-interval="1" skip-limit="100000000">
          <skippable-exception-classes>
            <!-- Exceptions are handled internally by the writer; exceptions should not be treated as failures -->
            <include class="java.lang.Exception"/>
          </skippable-exception-classes>
        </chunk>
      </tasklet>
      <listeners>
        <listener ref="promotionListener"/>
      </listeners>
    </step>

    <step id="zipFileTree" next="sendEmail">
      <description>Creates zip file</description>
      <tasklet ref="bulkExportZipWriter"/>
    </step>

    <step id="sendEmail" next="finishExport">
      <description>Send notification email</description>
      <tasklet ref="bulkExportSendNotification"/>
    </step>

    <step id="finishExport">
      <description>Finalise export</description>
      <tasklet ref="finishBulkActionTasklet"/>
    </step>

  </job>

My intention is to run one job at a time and queue other incoming jobs. However, in the log files I can see that jobs run in parallel. As you can see from the code snippets, I don't have any additional code to make Spring Batch run in parallel, and yet it still does. Can anyone please point out what I'm doing wrong?

You are using SimpleAsyncTaskExecutor, which runs jobs asynchronously, creating a new thread for each job. From its Javadoc:

TaskExecutor implementation that fires up a new Thread for each task, executing it asynchronously.

Supports limiting concurrent threads through the "concurrencyLimit" bean property. By default, the number of concurrent threads is unlimited.

NOTE: This implementation does not reuse threads! Consider a thread-pooling TaskExecutor implementation instead, in particular for executing a large number of short-lived tasks.
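As that note suggests, if you want bounded concurrency with thread reuse (rather than strict one-at-a-time execution), a pooled executor can be plugged into the same launcher definition. A minimal sketch, assuming Spring's ThreadPoolTaskExecutor is on the classpath; the pool and queue sizes are illustrative:

```xml
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
  <property name="jobRepository" ref="jobRepository"/>
  <property name="taskExecutor">
    <!-- Thread-pooling alternative: reuses worker threads instead of spawning one per job -->
    <bean class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
      <property name="corePoolSize" value="1"/>   <!-- illustrative sizes -->
      <property name="maxPoolSize" value="1"/>
      <property name="queueCapacity" value="10"/> <!-- jobs beyond the pool size wait here -->
    </bean>
  </property>
</bean>
```

With both pool sizes set to 1, queued jobs are held until the running job's thread becomes free, which also yields sequential execution.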

As suggested, if you absolutely need SimpleAsyncTaskExecutor you can set its concurrencyLimit property to 1 and have only one job running at a time. Alternatively, you can use the default SyncTaskExecutor to run jobs sequentially, so that when one finishes the next one runs (which, based on your explanation, I guess is what you want).
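Concretely, either of these launcher definitions should give you one-job-at-a-time behaviour. This is a sketch based on the configuration in the question; pick one of the two (they share the same bean id):

```xml
<!-- Option 1: keep SimpleAsyncTaskExecutor but cap concurrency at 1 -->
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
  <property name="jobRepository" ref="jobRepository"/>
  <property name="taskExecutor">
    <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor">
      <property name="concurrencyLimit" value="1"/>
    </bean>
  </property>
</bean>

<!-- Option 2: omit the taskExecutor property entirely; SimpleJobLauncher then
     defaults to a synchronous executor and runs each job in the calling thread -->
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
  <property name="jobRepository" ref="jobRepository"/>
</bean>
```

Note the difference in launch semantics: with Option 2 the `run()` call blocks until the job completes, while Option 1 still launches asynchronously but blocks further submissions while a job is running.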

