PedroLucasOM/SpringBatchLearning

Logo: SpringBatchLearning


πŸ’» Project to learn the Spring Batch Framework πŸƒ with 17 implemented jobs.

Topics

  1. About SpringBatch
  2. About the Project
  3. Author
  4. Contributing
  5. Show your support
  6. License

1. About SpringBatch

Spring Batch is a framework that runs on the Java Virtual Machine and uses the Spring ecosystem to build batch applications. By definition, batch systems process a finite amount of data without interaction or interruption.

To learn more about this framework, see this article on Notion: SpringBatch Article

2. About the Project

Implemented Jobs

Below, you will see how the implemented jobs work:

HelloWorldJob

Objective: This Job is responsible for printing "Hello, World!" to the screen. It is a basic Job configured to call a Tasklet Step that executes this action.

View Code

JOB_NAME: helloWorld
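
As an illustration, a minimal Tasklet job along these lines could look like the sketch below (bean names, class layout and the incrementer are assumptions, not the project's exact code):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class HelloWorldJobConfig {

    @Bean
    public Job helloWorldJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        // Single Tasklet step that prints the message and finishes.
        Step step = steps.get("helloWorldStep")
                .tasklet((contribution, chunkContext) -> {
                    System.out.println("Hello, World!");
                    return RepeatStatus.FINISHED;
                })
                .build();

        return jobs.get("helloWorld")
                .start(step)
                .incrementer(new RunIdIncrementer()) // illustrative: new JobInstance per run
                .build();
    }
}
```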

EvenOrOddJob

Objective: This Job is responsible for printing to the screen whether each of the given numbers is even or odd.

It is configured to call a Chunk Step that receives an Integer and returns a String, processing 10 records per transaction.

View Code

Show components
  • Reader: Its reader is based on ItemReader and returns a fixed list of numbers from 0 to 10.

  • Processor: Its processor is based on ItemProcessor and receives each number, checks whether it is even or odd and converts it to text in the format 'Item number Γ© par' (even) or 'Item number Γ© impar' (odd).

  • Writer: Its writer is based on ItemWriter and prints the processor's results to the screen.


JOB_NAME: evenOrOdd
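
A minimal sketch of such a chunk step, assuming the fixed number list and output strings described above (wiring details are illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.IteratorItemReader;

public class EvenOrOddStepSketch {

    public Step evenOrOddStep(StepBuilderFactory steps) {
        // Fixed list of numbers from 0 to 10, as described above.
        List<Integer> numbers = IntStream.rangeClosed(0, 10).boxed().collect(Collectors.toList());

        // Converts each number to the textual even/odd representation.
        ItemProcessor<Integer, String> processor =
                item -> String.format("Item %d Γ© %s", item, item % 2 == 0 ? "par" : "impar");

        // Prints each processed item to the screen.
        ItemWriter<String> writer = items -> items.forEach(System.out::println);

        return steps.get("evenOrOddStep")
                .<Integer, String>chunk(10) // 10 records per transaction
                .reader(new IteratorItemReader<>(numbers))
                .processor(processor)
                .writer(writer)
                .build();
    }
}
```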

FixedLengthJob

Objective: This Job is responsible for reading a flat file containing a Client list in fixed-length format and writing another file with the same records and format.

It is configured to call a Chunk Step that receives an ItemReader and an ItemWriter, both typed as Client, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader and reads the fixed-length file clients-fixed.txt, defining the columns name, nickname, age, email and salaryRange with integer ranges, unmarshalling each line into a Client and returning it to the Step.

  • Writer: Its writer is based on FlatFileItemWriter and writes a flat file to the path /files/output/fixedLength.txt with the same columns, records and format.


JOB_NAME: fixedLengthJob
GENERATED_FILE: fixedLength.txt
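
A hedged sketch of the reader/writer pair using Spring Batch's builders - the column ranges and format string are assumptions, since the real widths are defined by the project's file:

```java
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.file.builder.FlatFileItemWriterBuilder;
import org.springframework.batch.item.file.transform.Range;
import org.springframework.core.io.FileSystemResource;

public class FixedLengthIoSketch {

    public FlatFileItemReader<Client> reader() {
        return new FlatFileItemReaderBuilder<Client>()
                .name("fixedLengthReader")
                .resource(new FileSystemResource("files/clients-fixed.txt"))
                .fixedLength()
                // Column ranges are illustrative; the real file defines its own widths.
                .columns(new Range(1, 10), new Range(11, 20), new Range(21, 23),
                         new Range(24, 50), new Range(51, 60))
                .names("name", "nickname", "age", "email", "salaryRange")
                .targetType(Client.class)
                .build();
    }

    public FlatFileItemWriter<Client> writer() {
        return new FlatFileItemWriterBuilder<Client>()
                .name("fixedLengthWriter")
                .resource(new FileSystemResource("files/output/fixedLength.txt"))
                .formatted()
                // Format widths mirror the assumed input ranges above.
                .format("%-10s%-10s%-3s%-27s%-10s")
                .names("name", "nickname", "age", "email", "salaryRange")
                .build();
    }
}
```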

DelimitedFileJob

Objective: This Job is responsible for reading a flat file containing a Client list in delimited format and writing another file with the same records and format.

It is configured to call a Chunk Step that receives an ItemReader and an ItemWriter, both typed as Client, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader and reads the delimited file clients-delimited.txt, defining the columns name, nickname, age, email and salaryRange with ',' as the delimiter, unmarshalling each line into a Client and returning it to the Step.

  • Writer: Its writer is based on FlatFileItemWriter and writes a flat file to the path /files/output/delimitedFile.txt with the same columns and records, but using ';' as the delimiter.


JOB_NAME: delimitedFileJob
GENERATED_FILE: delimitedFile.txt
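
A sketch of the reader/writer pair under the same assumptions, showing the delimiter swap from ',' on input to ';' on output:

```java
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.file.builder.FlatFileItemWriterBuilder;
import org.springframework.core.io.FileSystemResource;

public class DelimitedIoSketch {

    public FlatFileItemReader<Client> reader() {
        return new FlatFileItemReaderBuilder<Client>()
                .name("delimitedReader")
                .resource(new FileSystemResource("files/clients-delimited.txt"))
                .delimited()
                .delimiter(",") // input uses ',' as delimiter
                .names("name", "nickname", "age", "email", "salaryRange")
                .targetType(Client.class)
                .build();
    }

    public FlatFileItemWriter<Client> writer() {
        return new FlatFileItemWriterBuilder<Client>()
                .name("delimitedWriter")
                .resource(new FileSystemResource("files/output/delimitedFile.txt"))
                .delimited()
                .delimiter(";") // output uses ';' as delimiter
                .names("name", "nickname", "age", "email", "salaryRange")
                .build();
    }
}
```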

MultipleFormatsFileJob

Objective: This Job is responsible for reading a flat file containing records of multiple types in delimited format and printing each record to the screen as its respective Java object, calling the toString method.

It is configured to call a Chunk Step that receives a FlatFileItemReader and an ItemWriter, both typed as a generic Java Object, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader and reads the multi-format delimited file clients-multiple-file1.txt, delegating to the lineMapper.

  • LineMapper: The LineMapper called inside the Reader reads each line and decides its record type from the value of the first column. If it is 0, the line is unmarshalled into a Client; if it is 1, into a Transaction. For both cases, the column properties are configured in their fieldSetMappers.

  • Writer: Its writer is based on ItemWriter and prints to the screen the results returned by the Reader via the LineMapper.


JOB_NAME: multipleFormatsFileJob
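
The project wires its own lineMapper; one standard way to express "record type decided by the first column" is Spring Batch's PatternMatchingCompositeLineMapper, sketched here with assumed tokenizer settings:

```java
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.batch.item.file.mapping.PatternMatchingCompositeLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.item.file.transform.LineTokenizer;

public class MultipleFormatsLineMapperSketch {

    public PatternMatchingCompositeLineMapper<Object> lineMapper(
            FieldSetMapper<Object> clientFieldSetMapper,
            FieldSetMapper<Object> transactionFieldSetMapper) {

        PatternMatchingCompositeLineMapper<Object> mapper = new PatternMatchingCompositeLineMapper<>();

        // Lines starting with 0 are Clients, lines starting with 1 are Transactions.
        Map<String, LineTokenizer> tokenizers = new HashMap<>();
        tokenizers.put("0*", new DelimitedLineTokenizer());
        tokenizers.put("1*", new DelimitedLineTokenizer());
        mapper.setTokenizers(tokenizers);

        // Each pattern routes to the fieldSetMapper that builds the matching object.
        Map<String, FieldSetMapper<Object>> fieldSetMappers = new HashMap<>();
        fieldSetMappers.put("0*", clientFieldSetMapper);
        fieldSetMappers.put("1*", transactionFieldSetMapper);
        mapper.setFieldSetMappers(fieldSetMappers);

        return mapper;
    }
}
```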

MultipleLineFileJob

Objective: This Job is responsible for reading a flat file containing records of multiple types in delimited format, grouping the transactions listed below each client into a Transaction list inside the corresponding Client Java object, and printing to the screen all clients with their respective transactions.

It is configured to call a Chunk Step that receives a custom reader and an ItemWriter, both typed as a generic Java Object. The custom reader receives a FlatFileItemReader whose lineMapper converts each record to its specific type and returns the result. It is configured to process 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader and reads the multi-format delimited file clients-multiple-file1.txt, delegating to the lineMapper.

  • Custom Reader: A reader that implements ItemStreamReader and ResourceAwareItemReaderItemStream. It receives the FlatFileItemReader, calls its read method and applies the business rule of adding each Transaction that follows a client to the list of transactions inside that Client (see the sketch after this job).

  • LineMapper: The LineMapper called inside the FlatFileItemReader reads each line and decides its record type from the value of the first column. If it is 0, the line is unmarshalled into a Client; if it is 1, into a Transaction. For both cases, the column properties are configured in their fieldSetMappers.

  • Writer: Its writer is based on ItemWriter and prints to the screen the results returned by the Reader via the LineMapper.


JOB_NAME: multipleLineFileJob
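
A minimal sketch of the grouping idea behind the custom reader, assuming Client exposes a mutable transactions list (class and accessor names are illustrative, not the project's exact code):

```java
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ItemStreamReader;
import org.springframework.batch.item.file.FlatFileItemReader;

public class ClientGroupingReaderSketch implements ItemStreamReader<Client> {

    private final FlatFileItemReader<Object> delegate;
    private Object cursor; // last record read from the delegate but not yet consumed

    public ClientGroupingReaderSketch(FlatFileItemReader<Object> delegate) {
        this.delegate = delegate;
    }

    @Override
    public Client read() throws Exception {
        if (cursor == null) {
            cursor = delegate.read();
        }
        if (!(cursor instanceof Client)) {
            return null; // end of file: signal no more items
        }
        Client client = (Client) cursor;
        // Attach every Transaction line that follows this Client line.
        cursor = delegate.read();
        while (cursor instanceof Transaction) {
            client.getTransactions().add((Transaction) cursor);
            cursor = delegate.read();
        }
        return client;
    }

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        delegate.open(executionContext);
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
        delegate.update(executionContext);
    }

    @Override
    public void close() throws ItemStreamException {
        delegate.close();
    }
}
```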

MultipleFileJob

Objective: This Job is responsible for reading one or more flat files containing records of multiple types in delimited format and printing each record to the screen as its respective Java object, calling the toString method.

It is configured to call a Chunk Step that receives a MultiResourceItemReader and an ItemWriter, both typed as a generic Java Object. This Reader delegates to a FlatFileItemReader whose lineMapper converts each record to its specific type and returns the result. It is configured to process 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on MultiResourceItemReader, which receives one or more flat files and delegates the reading to a FlatFileItemReader that reads the multi-format delimited files, in this case clients-multiple-file1.txt, clients-multiple-file2.txt and clients-multiple-file3.txt. The FlatFileItemReader then calls the lineMapper to apply this business rule.

  • LineMapper: The LineMapper called inside the FlatFileItemReader reads each line and decides its record type from the value of the first column. If it is 0, the line is unmarshalled into a Client; if it is 1, into a Transaction. For both cases, the column properties are configured in their fieldSetMappers.

  • Writer: Its writer is based on ItemWriter and prints to the screen the results returned by the Reader via the LineMapper.


JOB_NAME: multipleFileJob
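
A sketch of how a MultiResourceItemReader can wrap the same multi-format FlatFileItemReader (builder usage is standard; the file paths follow the description above):

```java
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.MultiResourceItemReader;
import org.springframework.batch.item.file.builder.MultiResourceItemReaderBuilder;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.Resource;

public class MultipleFileReaderSketch {

    public MultiResourceItemReader<Object> reader(FlatFileItemReader<Object> delegate) {
        Resource[] files = {
                new FileSystemResource("files/clients-multiple-file1.txt"),
                new FileSystemResource("files/clients-multiple-file2.txt"),
                new FileSystemResource("files/clients-multiple-file3.txt")
        };
        return new MultiResourceItemReaderBuilder<Object>()
                .name("multipleFileReader")
                .resources(files)
                .delegate(delegate) // the same multi-format FlatFileItemReader reads each file
                .build();
    }
}
```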

CursorDataSourceJob

Objective: This Job is responsible for reading data from a configured DataSource using a native SQL query and printing the returned data to the screen.

It is configured to call a Chunk Step that receives a JdbcCursorItemReader and an ItemWriter typed as Client, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on JdbcCursorItemReader, which receives a DataSource and a native select query to get the data. This reading strategy opens a single database cursor and streams the rows from it while the job is running.

  • Writer: Its writer is based on ItemWriter and prints the returned results to the screen.


JOB_NAME: cursorDataSourceJob
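
A minimal sketch of such a cursor-based reader, assuming a client table and a Client bean with matching properties:

```java
import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcCursorItemReaderBuilder;
import org.springframework.jdbc.core.BeanPropertyRowMapper;

public class CursorReaderSketch {

    public JdbcCursorItemReader<Client> reader(DataSource dataSource) {
        return new JdbcCursorItemReaderBuilder<Client>()
                .name("cursorReader")
                .dataSource(dataSource)
                .sql("select * from client") // table name is an assumption
                .rowMapper(new BeanPropertyRowMapper<>(Client.class))
                .build();
    }
}
```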

PaginatorDataSourceJob

Objective: This Job is responsible for reading data from a configured DataSource using a native SQL query built with a queryProvider and printing the returned data to the screen.

It is configured to call a Chunk Step that receives a JdbcPagingItemReader and an ItemWriter typed as Client, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on JdbcPagingItemReader, which receives a DataSource and a queryProvider that builds the paginated SQL query with a sortKey. This reading strategy fetches the data from the DataSource one page at a time, repeating for each chunk transaction.

  • Writer: Its writer is based on ItemWriter and prints the returned results to the screen.


JOB_NAME: paginatorDataSourceJob
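
A sketch of the paging reader built with SqlPagingQueryProviderFactoryBean; the table name, sort key and page size are assumptions:

```java
import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcPagingItemReader;
import org.springframework.batch.item.database.PagingQueryProvider;
import org.springframework.batch.item.database.builder.JdbcPagingItemReaderBuilder;
import org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean;
import org.springframework.jdbc.core.BeanPropertyRowMapper;

public class PagingReaderSketch {

    public JdbcPagingItemReader<Client> reader(DataSource dataSource) throws Exception {
        SqlPagingQueryProviderFactoryBean factory = new SqlPagingQueryProviderFactoryBean();
        factory.setDataSource(dataSource);
        factory.setSelectClause("select *");
        factory.setFromClause("from client"); // table name is an assumption
        factory.setSortKey("id");             // sortKey keeps pages stable between queries

        PagingQueryProvider queryProvider = factory.getObject();

        return new JdbcPagingItemReaderBuilder<Client>()
                .name("pagingReader")
                .dataSource(dataSource)
                .queryProvider(queryProvider)
                .pageSize(10) // rows fetched per page
                .rowMapper(new BeanPropertyRowMapper<>(Client.class))
                .build();
    }
}
```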

BudgetStatementJob

Objective: This Job is responsible for reading Launch records from a configured DataSource using a native SQL query, grouping them into budget statements and writing each budget statement to a file at the path /files/output/${budgetStatement.codeNatureExpense}.

It is configured to call a Chunk Step that receives a JdbcCursorItemReader typed as Launch, a custom writer and a footer writer, processing 1 record per transaction. It also has two listeners that are invoked before and after each write and before and after each chunk.

View Code

Show components
  • Reader: Its reader is based on JdbcCursorItemReader, which receives a DataSource and a native select query to get the Launch records. This reading strategy opens a single database cursor and streams the rows from it while the job is running.

  • Custom Reader: A reader that implements ItemStreamReader. It receives the JdbcCursorItemReader, calls its read method and applies the business rule of categorizing the returned Launch records, returning the list of BudgetStatement.

  • Custom Writer: A writer that reads the list of BudgetStatement and, for each budget statement, creates the corresponding file. Internally, a FlatFileItemWriter writes the data into the created file, calling the headerCallback, footerCallback, suffixCreator and lineAggregator to format the received data.

  • Footer Callback: A listener invoked around each write that counts the number of records and puts the total in the footer when the writer runs.


JOB_NAME: budgetStatementJob
GENERATED_FILES: The generated files are named ${budgetStatement.codeNatureExpense}.txt
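
A minimal sketch of the counting idea behind the footer callback, combining ItemWriteListener and FlatFileFooterCallback (class name and footer text are assumptions):

```java
import java.io.IOException;
import java.io.Writer;
import java.util.List;

import org.springframework.batch.core.ItemWriteListener;
import org.springframework.batch.item.file.FlatFileFooterCallback;

public class CountingFooterCallbackSketch
        implements FlatFileFooterCallback, ItemWriteListener<BudgetStatement> {

    private int count;

    @Override
    public void beforeWrite(List<? extends BudgetStatement> items) {
        count += items.size(); // count records as each write is about to happen
    }

    @Override
    public void afterWrite(List<? extends BudgetStatement> items) {
    }

    @Override
    public void onWriteError(Exception exception, List<? extends BudgetStatement> items) {
    }

    @Override
    public void writeFooter(Writer writer) throws IOException {
        writer.write("Total records: " + count); // footer text is an assumption
    }
}
```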

ValidatingJob

Objective: This Job is responsible for reading a flat file containing a Client list in delimited format, validating the records by checking for duplicated emails, and printing the valid records to the screen.

It is configured to call a Chunk Step that receives a FlatFileItemReader, an ItemProcessor and an ItemWriter, all typed as Client, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader typed as Client and reads the delimited file clients-validating.txt, defining the columns name, nickname, age, email and salaryRange with ',' as the delimiter, unmarshalling each line into a Client and returning it to the Step.

  • Processor: Its processor is based on ItemProcessor and, for each client, checks whether the client's email already exists in the processor's list of emails. If it does, a ValidationException is thrown. If it does not, the client's email is stored in the list and the item passes through. In both cases, the cycle keeps running until all records are processed (see the sketch after this job).

  • Writer: Its writer is based on ItemWriter and prints the returned results to the screen.


JOB_NAME: validatingJob
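
A minimal sketch of the duplicate-email processor described above (class name and Client accessor are assumptions):

```java
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.validator.ValidationException;

public class DuplicateEmailProcessorSketch implements ItemProcessor<Client, Client> {

    private final List<String> emails = new ArrayList<>();

    @Override
    public Client process(Client client) {
        if (emails.contains(client.getEmail())) {
            // Duplicated email: reject the record.
            throw new ValidationException("Duplicated email: " + client.getEmail());
        }
        emails.add(client.getEmail()); // remember the email for later records
        return client;
    }
}
```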

BeanValidatingJob

Objective: This Job is responsible for reading a flat file containing a Client list in delimited format, validating the records by checking that the fields are correct according to BeanValidatingItemProcessor, and printing the valid records to the screen.

It is configured to call a Chunk Step that receives a FlatFileItemReader, an ItemProcessor and an ItemWriter, all typed as Client, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader typed as Client and reads the delimited file clients-bean-validating.txt, defining the columns name, nickname, age, email and salaryRange with ',' as the delimiter, unmarshalling each line into a Client and returning it to the Step.

  • Processor: Its processor is based on ItemProcessor and, for each client, checks whether the client is valid according to BeanValidatingItemProcessor. If the bean validation fails, the client item is filtered out; if it passes, the item is returned. In both cases, the cycle keeps running until all records are processed.

  • Writer: Its writer is based on ItemWriter and prints the returned results to the screen.


JOB_NAME: beanValidatingJob
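
A sketch of the processor, using BeanValidatingItemProcessor in filter mode so that invalid items are skipped rather than failing the step, which matches the behavior described above:

```java
import org.springframework.batch.item.validator.BeanValidatingItemProcessor;

public class BeanValidatingProcessorSketch {

    public BeanValidatingItemProcessor<Client> processor() throws Exception {
        BeanValidatingItemProcessor<Client> processor = new BeanValidatingItemProcessor<>();
        // filter = true: invalid items are dropped instead of failing the step.
        processor.setFilter(true);
        processor.afterPropertiesSet(); // initializes the underlying Validator
        return processor;
    }
}
```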

CompositeJob

Objective: This Job is responsible for reading a flat file containing a Client list in delimited format, validating the records by checking for duplicated emails and checking that the fields are correct according to BeanValidatingItemProcessor. After that, it prints the valid records to the screen.

It is configured to call a Chunk Step that receives a FlatFileItemReader, an ItemProcessor and an ItemWriter, all typed as Client, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader typed as Client and reads the delimited file clients-composite.txt, defining the columns name, nickname, age, email and salaryRange with ',' as the delimiter, unmarshalling each line into a Client and returning it to the Step.

  • Processor: Its processor is based on CompositeItemProcessor, which calls two other processors based on ItemProcessor. For each client, they check whether the client is valid according to BeanValidatingItemProcessor and whether the client's email already exists in the list of emails. If either check fails, the client item is skipped; if both pass, the item is returned. In both cases, the cycle keeps running until all records are processed.

  • Writer: Its writer is based on ItemWriter and prints the returned results to the screen.


JOB_NAME: compositeJob
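
A sketch of chaining the two validations with CompositeItemProcessor (delegate wiring is illustrative):

```java
import java.util.Arrays;

import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.support.CompositeItemProcessor;

public class CompositeProcessorSketch {

    public CompositeItemProcessor<Client, Client> processor(
            ItemProcessor<Client, Client> beanValidatingProcessor,
            ItemProcessor<Client, Client> duplicateEmailProcessor) throws Exception {

        CompositeItemProcessor<Client, Client> processor = new CompositeItemProcessor<>();
        // Delegates run in order; an item must pass both to reach the writer.
        processor.setDelegates(Arrays.asList(beanValidatingProcessor, duplicateEmailProcessor));
        processor.afterPropertiesSet();
        return processor;
    }
}
```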

ValidatingScriptJob

Objective: This Job is responsible for reading a flat file containing a Client list in delimited format, validating the records by checking that the fields are correct according to ScriptItemProcessor, and printing the valid records to the screen.

It is configured to call a Chunk Step that receives a FlatFileItemReader, an ItemProcessor and an ItemWriter, all typed as Client, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader typed as Client and reads the delimited file clients-validating-script.txt, defining the columns name, nickname, age, email and salaryRange with ',' as the delimiter, unmarshalling each line into a Client and returning it to the Step.

  • Processor: Its processor is based on ItemProcessor and, for each client, checks whether the client is valid according to ScriptItemProcessor. It executes a JavaScript script and inspects the script's return value. If the item is returned, the client item is passed on to the Step scope; if null is returned, the client item is skipped. In both cases, the cycle keeps running until all records are processed.

  • Writer: Its writer is based on ItemWriter and prints the returned results to the screen.


JOB_NAME: validatingScriptJob
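
A sketch of a ScriptItemProcessor wired to an external JavaScript file; the script path is an assumption:

```java
import org.springframework.batch.item.support.ScriptItemProcessor;
import org.springframework.core.io.ClassPathResource;

public class ScriptProcessorSketch {

    public ScriptItemProcessor<Client, Client> processor() throws Exception {
        ScriptItemProcessor<Client, Client> processor = new ScriptItemProcessor<>();
        // The script receives each record as the variable "item";
        // returning null filters the record out. Script path is an assumption.
        processor.setScript(new ClassPathResource("scripts/clientValidation.js"));
        processor.afterPropertiesSet();
        return processor;
    }
}
```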

ClassifierJob

Objective: This job is responsible for reading a flat file containing a list of generic Java objects whose records can be typed as Client or Transaction, classifying the records and sending each item to its respective processor - either a ClientProcessor or a TransactionProcessor. After that, it prints the returned records to the screen.

It is configured to call a Chunk Step that receives a FlatFileItemReader, an ItemProcessor and an ItemWriter, all typed as a generic Java Object, processing 1 record per transaction.

View Code

Show components
  • Reader: Its reader is based on FlatFileItemReader typed as a generic Java Object and reads the delimited file clients-classifier.txt, calling the lineMapper to decide the type of each item - whether it is a Client or a Transaction. After that, the records are returned to the Step scope.

  • Processor: Its processor is based on ClassifierCompositeItemProcessor and, for each object, checks whether the item is typed as Client or Transaction, routing each type to its respective processor - either a ClientProcessor or a TransactionProcessor.

  • Writer: Its writer is based on ItemWriter and prints to the screen the results returned by the Reader via the LineMapper.


JOB_NAME: classifierJob
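
A sketch of routing items by concrete type with ClassifierCompositeItemProcessor (the processor beans are assumed to exist in the project):

```java
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.support.ClassifierCompositeItemProcessor;
import org.springframework.classify.Classifier;

public class ClassifierProcessorSketch {

    public ClassifierCompositeItemProcessor<Object, Object> processor(
            ItemProcessor<Client, Client> clientProcessor,
            ItemProcessor<Transaction, Transaction> transactionProcessor) {

        // Route each item to the processor that matches its concrete type.
        Classifier<Object, ItemProcessor<?, ? extends Object>> classifier =
                item -> item instanceof Client ? clientProcessor : transactionProcessor;

        ClassifierCompositeItemProcessor<Object, Object> processor =
                new ClassifierCompositeItemProcessor<>();
        processor.setClassifier(classifier);
        return processor;
    }
}
```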

BankAccountGenerateJob

Objective: This job is responsible for reading a list of Client records from a configured DataSource, processing and classifying the records by salaryRange and building a BankAccount for each one. After that, the writer builds a flat file with the valid accounts and persists them in the configured DataSource.

It is configured to call a Chunk Step that receives a JdbcPagingItemReader, a ClassifierCompositeItemProcessor, a ClassifierCompositeItemWriter and two @Qualifier-annotated writers typed as FlatFileItemWriter - the qualifiers define which beans are injected. These writers are registered explicitly via stream(), since ClassifierCompositeItemWriter is not an instance of ItemStream and its delegates would otherwise not be opened and closed (see the sketch after this job's components).

The BankAccountType can be:

  • PRATA
  • OURO
  • PLATINA
  • DIAMANTE
  • INVALID (in case the client has some wrong information)

View Code

Show components
  • Reader: Its reader is based on JdbcPagingItemReader, which receives a DataSource via @Qualifier and a pagingQueryProvider that builds the paginated SQL query with a sortKey - in this case, the email. This reading strategy fetches the data from the DataSource one page at a time, repeating for each chunk transaction.

  • Processor: Its processor is based on ClassifierCompositeItemProcessor, which uses the custom processor BankAccountGenerateClassifier as its classifier.

  • Custom Processor: Its custom processor implements a Classifier that is returned to the ClassifierCompositeItemProcessor. It builds a HashMap from each BankAccountType to its respective processor. It then calls BankAccountType's returnFromSalaryRange() method, passing the salaryRange, to obtain the matching BankAccountType. Finally, it looks up the corresponding processor in the previously configured HashMap and runs it to build the BankAccount.

  • Classifier Writer: Its writer is based on ClassifierCompositeItemWriter, which calls the classifier() method. This method checks whether the generated BankAccount has an instantiated BankAccountType. If it does, the classifier calls the Composite Account Writer. If it does not, something was wrong with the Client and the Invalid Client Writer is called.

  • Invalid Client Writer: Its custom writer takes the invalid clients and builds a flat file with the FlatFileItemWriter at the path /files/output/invalidClients.txt.

  • Composite Account Writer: Its custom writer is typed as CompositeItemWriter and builds a flat file with FlatFileItemWriter. This flat file has bankAccountType, maxLimit and clientId as columns and is written to the path /files/output/bankAccounts.txt. At the same time, these records are persisted in the configured DataSource with the JdbcBatchItemWriter.


JOB_NAME: bankAccountGenerateJob
GENERATED_FILES: bankAccounts.txt and invalidClients.txt
SELECT_QUERY: select * from app_batch.bank_account;
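
A sketch of the stream() registration described above - because ClassifierCompositeItemWriter is not an ItemStream, the delegate file writers are registered on the step so Spring Batch opens and closes them (types and bean wiring are assumptions):

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.database.JdbcPagingItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.support.ClassifierCompositeItemWriter;

public class BankAccountStepSketch {

    public Step bankAccountStep(StepBuilderFactory steps,
                                JdbcPagingItemReader<Client> reader,
                                ItemProcessor<Client, BankAccount> processor,
                                ClassifierCompositeItemWriter<BankAccount> classifierWriter,
                                FlatFileItemWriter<BankAccount> accountFileWriter,
                                FlatFileItemWriter<BankAccount> invalidClientWriter) {
        return steps.get("bankAccountGenerateStep")
                .<Client, BankAccount>chunk(1)
                .reader(reader)
                .processor(processor)
                .writer(classifierWriter)
                // ClassifierCompositeItemWriter is not an ItemStream, so the delegate
                // file writers must be registered explicitly to be opened and closed.
                .stream(accountFileWriter)
                .stream(invalidClientWriter)
                .build();
    }
}
```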

PointSheetJob

Objective: This job is responsible for reading a list of Employee records from a configured DataSource and processing each one to check whether the employee has any pointRegistry. If not, an invalid PointSheet with only the registrationCode field is built and sent to the writer; otherwise a PointSheet with all fields is built and sent to the writer. After that, the writer follows this rule: if the item is an invalid PointSheet, a file is built with the registrationCodes of those employees; otherwise a formatted file is built with the valid PointSheet list while the records are persisted in the configured DataSource.

It is configured to call a Chunk Step that receives a JdbcCursorItemReader, a ClassifierCompositeItemProcessor, a ClassifierCompositeItemWriter and two @Qualifier-annotated writers - one typed as FlatFileItemWriter and the other as CompositeItemWriter. Both are registered explicitly via stream(), since ClassifierCompositeItemWriter is not an instance of ItemStream and its delegates would otherwise not be opened and closed.

View Code

Show components
  • Reader: Its reader is based on JdbcCursorItemReader, which receives a DataSource and executes a query to get the list of Employee records, performing a left join with the table where the point registries are stored so that each Employee row is repeated once per registrationDate.

  • Custom Reader: Its custom reader implements ItemStreamReader, reads the Employee rows returned by the JdbcCursorItemReader and groups the duplicated rows into a single Employee containing all the registration dates.

  • Processor: Its processor is based on ClassifierCompositeItemProcessor, which uses the custom processor PointSheetClassifier as its classifier.

  • Custom Processor: Its custom processor implements a Classifier that is returned to the ClassifierCompositeItemProcessor. It is responsible for checking whether an Employee has no pointRegistry. If so, the InvalidPointSheetProcessor is called and an invalid PointSheet is built. Otherwise, the PointSheetProcessor is called and the object is built correctly.

  • Writer: Its writer is based on ClassifierCompositeItemWriter and calls a Classifier to choose which writer must be called according to the returned PointSheet, checking whether it is valid or invalid. If valid, it calls the PointSheetComposite Writer; otherwise, it calls the EmployeeWithoutPointSheetFile Writer.

  • PointSheetComposite Writer: Its custom writer is responsible for persisting each valid PointSheet in the configured DataSource while also building a file with a custom PointSheet format.

  • EmployeeWithoutPointSheetFile Writer: Its custom writer is responsible for building a file with the registrations of all invalid Employee records.


JOB_NAME: pointSheetJob
GENERATED_FILES: pointSheet.txt and employeeWithoutPointSheet.txt
SELECT_QUERY: select * from app_batch.point_sheet;

Prerequisites

  • docker

Configuration

To run a specific job, you need to set an environment variable called JOB_NAME with the name of the job that you want to execute.

Windows

In the Command Prompt, run:

set JOB_NAME=jobName

Linux

In the terminal, run:

export JOB_NAME=jobName

Mac

In the terminal, run:

touch ~/.bash_profile; open ~/.bash_profile

In TextEdit, add:

export JOB_NAME=jobName

Save the .bash_profile file and quit TextEdit (Command + Q).

Run

With Docker running, execute this command at the project root:

docker-compose up -d --build

Usage

To see the results of the application, you can use the following ways, depending on the job:

Seeing results in the log

In the base directory of the project, with the application running in Docker, run:

docker-compose logs -f -t app

These are the jobs that print their results in the log: helloWorld, evenOrOdd, multipleFormatsFileJob, multipleLineFileJob, multipleFileJob, cursorDataSourceJob, paginatorDataSourceJob, validatingJob, beanValidatingJob, compositeJob, validatingScriptJob and classifierJob.

Seeing results in the file

In the base directory of the project, navigate to /files/output. All files generated by the jobs will be there.

Note: you can see the output filename in each implemented job description above.

These are the jobs that build a file with their results: fixedLengthJob, delimitedFileJob, budgetStatementJob, bankAccountGenerateJob and pointSheetJob.

Seeing results in the database

In the base directory of the project, with the application running in Docker, run:

docker-compose exec database_app mysql -u root -papp#1234 -e "select ..."

Note: you can see the query in each implemented job description above and put it in place of "select ...".

These are the jobs that persist their results in the database: bankAccountGenerateJob and pointSheetJob.

Stop

To stop correctly:

docker-compose down -v

Remember to execute this command each time you want to change the parameter value.

3. Author

πŸ‘€ Pedro Lucas

4. Contributing 🀝

Contributions, issues and feature requests are welcome!
Feel free to check the issues page.

5. Show your support

Give a ⭐ if this project helped you!

6. License πŸ“

Copyright Β© 2021 Pedro Lucas.