Spring Batch / BATCH-1799

Exception in flush of file output ItemWriters does not abort a step/job

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Complete
    • Affects Version/s: None
    • Fix Version/s: 2.2.0, 2.2.0 - Sprint 6
    • Component/s: Infrastructure
    • Labels:
      None
    • Environment:
      Reproduced with [Spring Batch 2.1.5 / Spring 3.0.2] and [Spring Batch 2.1.8 / Spring 3.0.2].

      Description

      Scenario:
      Using a FlatFileItemWriter to write to a file on a full disk/memory stick. (Note: there must be enough free space on the disk/memory stick to create the file during the call to open().)

      What I would expect:
      The step (and consequently the job) should fail, since the data could not be written to the file due to the missing space.

      What happens:
      The IOException is simply logged, and the step does not fail.

      What is the result:
      The written file is corrupt because it is incomplete. A restart is not possible, since the failing step actually ends in state COMPLETED.

      What causes the problem:
      Described in http://forum.springsource.org/showthread.php?115739-DiskFull-IOException-does-not-result-in-a-failed-job-when-writing-to-a-file
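The root cause described in the thread is that the IOException raised while flushing buffered data is caught and logged instead of being propagated to the step. A minimal, self-contained Java sketch of this failure mode (this is illustrative code, not the Spring Batch implementation; FullDiskOutputStream and the two close variants are hypothetical names):

```java
import java.io.IOException;
import java.io.OutputStream;

public class FlushFailureDemo {

    // Simulates a full disk: writes appear to succeed (they are buffered),
    // but flushing the buffer to the device fails.
    static class FullDiskOutputStream extends OutputStream {
        @Override public void write(int b) { /* buffered; seems to succeed */ }
        @Override public void flush() throws IOException {
            throw new IOException("No space left on device");
        }
    }

    // Buggy pattern (the reported behavior): the IOException from flush()
    // is only logged, so the caller sees a successful close and the step
    // ends with status COMPLETED despite the lost data.
    static String closeSwallowing(OutputStream out) {
        try {
            out.flush();
        } catch (IOException e) {
            System.err.println("WARN: error on flush: " + e.getMessage());
        }
        return "COMPLETED";
    }

    // Expected pattern: the flush failure surfaces to the step, which then
    // transitions to FAILED, leaving the job restartable.
    static String closePropagating(OutputStream out) {
        try {
            out.flush();
            return "COMPLETED";
        } catch (IOException e) {
            return "FAILED";
        }
    }

    public static void main(String[] args) {
        OutputStream disk = new FullDiskOutputStream();
        System.out.println(closeSwallowing(disk));   // prints COMPLETED (wrong)
        System.out.println(closePropagating(disk));  // prints FAILED (expected)
    }
}
```

The sketch shows why the step must not swallow the exception: only the propagating variant lets the framework mark the step FAILED and keep the job restartable.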

    People

    • Assignee: Michael Minella (mminella)
    • Reporter: Hansjoerg Wingeier (hansjoerg)
    • Votes: 0
    • Watchers: 5
