Releases: SETL-Framework/setl

SETL-1.0.0-RC2

30 Mar 06:17
3158cf4
Pre-release

BREAKING CHANGE:

  • Changed the group ID to io.github.setl-framework
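
Under the new group ID, the dependency declaration changes accordingly. A minimal sbt sketch (the artifact name setl_2.12 and the exact version string are illustrative assumptions, not confirmed coordinates):

```scala
// sbt build definition: SETL under its new group ID (coordinates assumed for illustration)
libraryDependencies += "io.github.setl-framework" % "setl_2.12" % "1.0.0-RC2"
```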

SETL-1.0.0-RC1

21 Aug 11:47
Pre-release

New Features:

  • Added Spark 3.0 support

Fixes:

  • Fixed save mode in DynamoDB Connector

SETL-0.4.3

10 Jul 14:35

Changes:

  • Updated spark-cassandra-connector from 2.4.2 to 2.5.0
  • Updated spark-excel-connector from 0.12.4 to 0.13.1
  • Updated spark-dynamodb-connector from 1.0.1 to 1.0.4
  • Updated scalatest (scope test) from 3.1.0 to 3.1.2
  • Updated postgresql (scope test) from 42.2.9 to 42.2.12

New Features:

  • Added a pipeline dependency check before starting the Spark job
  • Added default Spark job group and description
  • Added StructuredStreamingConnector
  • Added DeltaConnector
  • Added ZipArchiver that can zip files/directories
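
The ZipArchiver entry above can be sketched with nothing but the JDK: a minimal, self-contained recursive directory zipper built on java.util.zip. The object and method names here are illustrative only, not SETL's actual ZipArchiver API:

```scala
import java.io.{File, FileInputStream, FileOutputStream}
import java.util.zip.{ZipEntry, ZipOutputStream}

// Illustrative sketch only -- SETL's real ZipArchiver API may differ.
object ZipSketch {
  // Recursively writes `source` (a file or a directory) into the zip file `target`.
  def zip(source: File, target: File): Unit = {
    val out = new ZipOutputStream(new FileOutputStream(target))
    try {
      def add(f: File, prefix: String): Unit =
        if (f.isDirectory)
          // Descend into the directory, extending the entry-name prefix.
          f.listFiles().foreach(child => add(child, prefix + f.getName + "/"))
        else {
          out.putNextEntry(new ZipEntry(prefix + f.getName))
          val in = new FileInputStream(f)
          try {
            // Copy the file into the current zip entry in 4 KiB chunks.
            val buf = new Array[Byte](4096)
            Iterator.continually(in.read(buf)).takeWhile(_ != -1)
              .foreach(n => out.write(buf, 0, n))
          } finally in.close()
          out.closeEntry()
        }
      add(source, "")
    } finally out.close()
  }
}
```

A plain file becomes a single entry; a directory is walked recursively, with each entry name prefixed by its directory path inside the archive.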

Fixes:

  • Fixed the path separator in FileConnectorSuite that caused test failures
  • Fixed Setl.hasExternalInput, which always returned false

SETL-0.4.2

15 Feb 12:42

Fixed cross-compilation issue (#111)

SETL-0.4.1

13 Feb 15:06

Changes:

  • Changed benchmark unit of time to seconds (#88)

Fixes:

  • The SparkSession master URL can now be overridden in a local environment (#74)
  • FileConnector now lists paths correctly for nested directories (#97)

New Features:

  • Added Mermaid diagram generation to Pipeline (#51)
  • Added showDiagram() method to Pipeline that prints the Mermaid code and generates the live editor URL 🎩🐰✨ (#52)
  • Added Codecov report and Scala API doc
  • Added delete method in JDBCConnector (#82)
  • Added drop method in DBConnector (#83)
  • Added support for both of the following Spark configuration styles in the SETL builder (#86)
    setl.config {
      spark {
        spark.app.name = "my_app"
        spark.sql.shuffle.partitions = "1000"
      }
    }
    
    setl.config_2 {
      spark.app.name = "my_app"
      spark.sql.shuffle.partitions = "1000"
    }

Others:

  • Improved test coverage

v0.4.0

09 Jan 22:19
0b486a2

Changes:

  • BREAKING CHANGE: Renamed DCContext to Setl
  • Changed the default application environment config path to setl.environment
  • Changed the default context config path to setl.config

Fixes:

  • Fixed an issue where DynamoDBConnector didn't take user configuration
  • Fixed an issue with the CompoundKey annotation: SparkRepository now correctly
    handles columns having multiple compound keys (#36)

New Features:

  • Added support for private variable delivery (#24)
  • Added empty SparkRepository as placeholder (#30)
  • Added the Benchmark annotation, which can be used on methods of an AbstractFactory (#35)

Others:

  • Optimized DeliverableDispatcher
  • Optimized PipelineInspector (#33)