# Migration guide
To see the biggest differences, please consult the changelog.
## In version 1.8.0
### Scenario authoring changes
- #3924 Fixup: `{}` is now interpreted as an "allow everything" schema, not as an "object" schema. Object schemas have to declare `"type": "object"` explicitly (see the example below).
- #3924 Fixup: `Unknown` is now allowed on sinks in both validation modes if the output schema is the "allow everything" schema.
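Under the new interpretation, the empty schema `{}` accepts any value; to keep the previous behaviour, declare the type explicitly:

```json
{
  "type": "object"
}
```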
### Code API changes
- #3924 Changes to the `SwaggerTyped` hierarchy:
  - `SwaggerMap(valuesType)` -> `SwaggerObject(Map.empty, additionalProperties = AdditionalPropertiesEnabled(valuesType))`
  - `AdditionalPropertiesSwaggerTyped` -> `AdditionalPropertiesEnabled`
  - `AdditionalPropertiesWithoutType` -> `AdditionalPropertiesEnabled(SwaggerAny)`
  - `SwaggerRecursiveSchema`/`SwaggerUnknownFallback` -> `SwaggerAny`
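A minimal sketch of migrating a former `SwaggerMap` usage, following the mapping above (`SwaggerString` and the import path are assumptions - adjust them to your codebase):

```scala
// the import path is an assumption; adjust to where SwaggerTyped lives in your project
import pl.touk.nussknacker.engine.json.swagger._

// before 1.8.0: a map with string values
// val mapType = SwaggerMap(SwaggerString)

// since 1.8.0: an object with no declared fields and additional properties enabled
val mapType = SwaggerObject(Map.empty, additionalProperties = AdditionalPropertiesEnabled(SwaggerString))
```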
### Other changes
- #3835 Removed Signals and QueryableState. This change affects:
  - Configuration
  - Components and DeploymentManager API
  - REST API
- #3823, #3836, #3843 - Scenarios with multiple sources can now be tested from a file:
  - `TestDataGenerator#generateTestData` returns JSON test records instead of raw bytes. Test records are serialized to a file by the designer.
  - A test record can optionally contain a timestamp, which is used to sort records generated by multiple sources.
  - `TestDataParser` was replaced with `TestRecordParser`, which turns a single JSON test record into a source record.
  - The `TestData.newLineSeparated` helper was removed. Scenario test records have to be created explicitly. Each scenario test record has an assigned source.
  - `DeploymentManager#test` takes `ScenarioTestData` instead of `TestData`.
- Designer configuration `testDataSettings.testDataMaxBytes` was renamed to `testDataMaxLength`
- #3916 Designer configuration `environmentAlert.cssClass` was renamed to `environmentAlert.color`
- #3922 Bumps: jwks: 0.19.0 -> 0.21.3, jackson: 2.11.3 -> 2.13.4
- #3929 From now on, the `SchemaId` value class is used everywhere a schema id was previously represented as an `Int`. For conversion between `SchemaId` and `Int` use `SchemaId.fromInt` and `SchemaId.asInt`. Use `ConfluentUtils.toSchemaWithMetadata` instead of `SchemaWithMetadata.apply` for conversion between Confluent's `SchemaMetadata` and our `SchemaWithMetadata` (see the sketch after this list).
- #3948 We now depend less on the Confluent schema registry. To make this possible, some Kafka universal/avro components were refactored. The most important changes in the public API:
  - `ConfluentSchemaBasedSerdeProvider.universal` was replaced by `UniversalSchemaBasedSerdeProvider.create`
  - Non-Confluent classes were renamed and moved to appropriate packages
  - Extracted a new class, `SchemaIdFromMessageExtractor`, to make Confluent logic explicit; it was moved to the top level
  - Extracted `SchemaValidator` to make Confluent logic explicit and composable
  - Some renames: `ConsumerRecordUtils` -> `KafkaRecordUtils`
  - `RecordDeserializer` -> `AvroRecordDeserializer` (inheritance was also replaced by composition)
  - `(De)SerializerFactory` - simpler abstractions
  - `ConfluentSchemaRegistryFactory` is no longer necessary and was removed
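A minimal sketch of the `SchemaId` conversions from #3929 (the import path is an assumption):

```scala
// the import path is an assumption - SchemaId is the value class introduced in #3929
import pl.touk.nussknacker.engine.schemedkafka.schemaregistry.SchemaId

val id: SchemaId = SchemaId.fromInt(42) // wrap a raw schema registry id
val raw: Int     = id.asInt             // unwrap where a plain Int is still required
```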
## In version 1.7.0
### Scenario authoring changes
- #3701 Accessing a non-existent field on a `TypedMap` in SpEL no longer throws an exception - it now returns `null`
- #3727 Improvements: changed the way request-response sink validation works:
  - Added a `Value validation mode` parameter to the request-response response component
  - We no longer support the `nullable` param from the Everit schema. Nullable schemas are supported via a union with null, e.g. `["null", "string"]` (see the example below)
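For example, a nullable string is now expressed as a type union with `"null"` instead of Everit's `nullable` flag:

```json
{
  "type": ["null", "string"]
}
```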
### Configuration changes
- #3768 The `request-response-embedded` and `streaming-lite-embedded` DeploymentManager types were replaced by a single `lite-embedded` DeploymentManager type with two modes, `streaming` and `request-response`, as in the `lite-k8s` case (see the sketch below)
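A hypothetical deployment configuration fragment after this change (the key layout is an assumption - check your scenario type configuration):

```hocon
deploymentConfig {
  type: "lite-embedded"
  mode: "request-response"  # or "streaming"
}
```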
### Code API changes
- #3560, #3595 Removed the dependency on `flink-scala`. In particular:
  - Switched from using `scala.DataStream` to `datastream.DataStream`. Some tools exclusive to Scala datastreams are available in `engine.flink.api.datastream`
  - Scala-based `TypeInformation` derivation is no longer used; for the remaining cases the `flink-scala-utils` module is provided (it will probably be removed in the future)
- #3680 The type of `SubprocessRef::outputVariableNames` changed from `Option[Map[String, String]]` with default `None` to `Map[String, String]` with default `Map.empty`
- #3692 Renamed `mockedResult` to `externalInvocation` in test results collectors.
- #3606 Removed nussknacker-request-response-app. As a replacement you can use:
  - nussknacker-request-response-app in version <= 1.6
  - the Lite K8s engine with the request-response processing mode
  - the `lite-embedded` Deployment Manager with the request-response processing mode
- #3610 Removed deprecated code. For details, see the changes in the pull request.
- #3607 Request-response jsonSchema-based encoder:
  - `ValidationMode` moved to the `pl.touk.nussknacker.engine.api.validation` package in `nussknacker-components-api`
  - `BestEffortJsonSchemaEncoder` moved to the `pl.touk.nussknacker.engine.json.encode` package in `nussknacker-json-utils`
- #3738 Kafka client libraries upgraded to 3.2.3. If you use an older Flink version, make sure to use 2.8.x client libraries. For Flink versions 1.15.0-1.15.2, also include the fixed `KafkaMetricWrapper`
- #3668 The `runWithRequests` method of `RequestResponseTestScenarioRunner` (returned by `TestScenarioRunner.requestResponseBased()`) now returns a `ValidatedNel` with scenario compilation errors instead of throwing an exception in that case (see the sketch below)
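Callers that previously relied on try/catch around `runWithRequests` now have to branch on the returned `ValidatedNel`. A minimal sketch of the pattern (the `run()` stand-in is hypothetical; only the `ValidatedNel` handling is the point):

```scala
import cats.data.Validated.{Invalid, Valid}
import cats.data.ValidatedNel

// hypothetical stand-in for runner.runWithRequests(...); the real call returns
// ValidatedNel[ProcessCompilationError, _] as described above
def run(): ValidatedNel[String, Int] = Valid(42)

run() match {
  case Valid(result)   => println(s"scenario ran, result: $result")
  case Invalid(errors) => println(s"compilation errors: ${errors.toList.mkString(", ")}")
}
```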
### REST API changes
- #3576 The `/processes` endpoint called without query parameters now returns all scenarios - the previous behaviour was to return only unarchived ones. To fetch only unarchived scenarios, the `isArchived=false` query parameter has to be passed.
### Other changes
- #3824 Due to data serialization fix, Flink scenarios using Kafka sources with schemas may be incompatible and may need to be restarted with clean state.
## In version 1.6.0
- #3440 Feature: allow defining fragment outputs. Using fragments in scenarios has changed: an `outputName` now has to be provided for each output defined in the fragment.
### Scenario authoring changes
- #3370 Feature: scenario node categories are verified on validation. From now on, importing a scenario containing nodes from categories other than the scenario's category is not allowed.
- #3436 Division by zero will cause a validation error. Tests that rely on `1/0` to generate exceptions should be changed to code like `1/{0, 1}[0]`
- #3473 `JsonRequestResponseSinkFactory` also provides a 'raw editor'; to turn it on, add `SinkRawEditorParamName -> "true"`
- #3608 Use `ZonedDateTime` for the `date-time` JSON Schema format and `OffsetTime` for the `time` format.
### Code API changes
- #3406 Migration from Scalatest 3.0.8 to Scalatest 3.2.10 - if necessary, see the Scalatest migration guides, https://www.scalatest.org/release_notes/3.1.0 and https://www.scalatest.org/release_notes/3.2.0
- #3431 Renamed `helper-utils` to `default-helpers`, separated `MathUtils` from `components-utils` into `math-utils`, removed dependencies from `helper-utils`
- #3420 `DeploymentManagerProvider.typeSpecificInitialData` now takes a deployment config `Config`
- #3493, #3582 Added methods `DeploymentManagerProvider.additionalPropertiesConfig` and `DeploymentManagerProvider.additionalValidators`
- #3506 Changed `LocalDateTime` to `Instant` in `OnDeployActionSuccess` in `listener-api`
- #3513 Replaced `EspProcess` with `CanonicalProcess` in all parts of the API except for the compiler.
- #3545 `TestScenarioRunner.flinkBased` should be used instead of `NuTestScenarioRunner.flinkBased`. Before this, you need to `import pl.touk.nussknacker.engine.flink.util.test.FlinkTestScenarioRunner._`
- #3386 Changed the `CustomProcessValidator` `validate` method. It now receives `CanonicalProcess` instead of `DisplayableProcess` and returns `ValidatedNel[ProcessCompilationError, Unit]` instead of `ValidationResult`. Moved `CustomProcessValidator` from the `validation` package of the `nussknacker-restmodel` module to `nussknacker-extensions-api` (see the sketch after this list).
- #3586 Module `nussknacker-ui` was renamed to `nussknacker-designer`, `ui.conf` was renamed to `designer.conf`, `defaultUiConfig.conf` was renamed to `defaultDesignerConfig.conf`
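A sketch of the new `validate` shape from #3386 (the import paths are assumptions; the type names come from the entry itself, and `CustomProcessValidator` now lives in `nussknacker-extensions-api`):

```scala
import cats.data.Validated.valid
import cats.data.ValidatedNel
// the paths below are assumptions - adjust them to your Nussknacker version
import pl.touk.nussknacker.engine.api.context.ProcessCompilationError
import pl.touk.nussknacker.engine.canonicalgraph.CanonicalProcess

class MyProcessValidator extends CustomProcessValidator {
  // receives CanonicalProcess (not DisplayableProcess) and returns a ValidatedNel
  // instead of ValidationResult
  override def validate(process: CanonicalProcess): ValidatedNel[ProcessCompilationError, Unit] =
    valid(()) // return Invalid(...) with compilation errors to reject a scenario
}
```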
### REST API changes
- #3506 Dates returned by the REST API (createdAt, modifiedAt, createDate) are now returned in Zulu time, with timezone indication. This affects e.g. `/api/processes`, `/api/processes/{scenarioId}`, `/api/processes/{scenarioId}/activity`
- #3542 Node additional info path renamed from `nodes/{scenarioId}/additionalData` to `nodes/{scenarioId}/additionalInfo`
### Scenario API changes
- #3471, #3553 `RequestResponseMetaData(path)` was changed to `RequestResponseMetaData(slug)`. The `V1_033__RequestResponseUrlToSlug` migration is ready for that; the change also applies to the Scenario DSL.
- #3513 The Scenario DSL returns `CanonicalProcess` instead of `EspProcess`.
- #3630 `SubprocessOutput` changed to `SubprocessUsageOutput`; there are also changes in the `OutputVar` definition
### Configuration changes
- #3425 Deployment Manager configuration parameters for `request-response-embedded` changed:
  - `interface` -> `http.interface`
  - `port` -> `http.port`
  - `definitionMetadata` -> `request-response.definitionMetadata`
- #3502 Refactoring of `KafkaProperties`: the `kafkaAddress` property has been deprecated. Please provide `kafkaProperties."bootstrap.servers"` instead, as shown below.
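A hypothetical configuration fragment after this change (the broker address is a placeholder):

```hocon
kafka {
  # deprecated: kafkaAddress: "localhost:9092"
  kafkaProperties {
    "bootstrap.servers": "localhost:9092"
  }
}
```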
### Other changes
- #3441 Updated Flink 1.14.5 -> 1.15.2. Some Flink artefacts no longer have a Scala version suffix. Tests using Flink may need to disable checkpointing or reduce the time between checkpoints to prevent timeouts or long waits for tasks to finish.
## In version 1.5.0
### Configuration changes
- #2992 `deploySettings` changed to `deploymentCommentSettings`; when specified, it now requires you to also specify the field `validationPattern`, while specifying `exampleComment` is optional (see the sketch after this list).
- `commentSettings` fields modified: `matchExpression` changed to `substitutionPattern`, `link` changed to `substitutionLink`.
- #3165 Config is no longer exposed over http (GET /api/app/config/) by default. To enable it, set the `enableConfigEndpoint` configuration option to `true`.
- #3223 OAuth2 configuration `defaultTokenExpirationTime` changed to `defaultTokenExpirationDuration`
- #3263 Batch periodic scenarios carry the processing type to distinguish scenarios from different categories. For existing scenarios the processing type is migrated to `default`. Set `deploymentManager.processingType` to `default` or update the periodic scenarios table with the actual processing type value - ideally it should be the same value as the periodic engine key in `scenarioTypes`.
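A hypothetical fragment for the renamed `deploymentCommentSettings` from #2992 (the pattern and example values are placeholders):

```hocon
deploymentCommentSettings {
  validationPattern: "issue/[0-9]*"  # required when the section is specified
  exampleComment: "issue/1234"       # optional
}
```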
### Code API changes
- #2992 `OnDeployActionSuccess` in `ProcessChangeEvent` now requires an instance of `Option[Comment]` instead of `Option[String]` as the parameter carrying deployment comment information. Added abstract class `Comment` in `listener-api`.
- #3136 Improvements: Lite Kafka testkit:
  - `ConfluentUtils.serializeRecordToBytesArray` replaced by `ConfluentUtils.serializeDataToBytesArray`
  - `ConfluentUtils.deserializeSchemaIdAndRecord` replaced by `ConfluentUtils.deserializeSchemaIdAndData`
- #3178 Improvements: more complex test scenario runner result. Each method from `TestScenarioRunner` now returns `ValidatedNel[ProcessCompilationError, RunResult[R]]` where:
  - `Invalid` is the representation of process compilation errors
  - `Valid` is the representation of positive and negative scenario running results
- #3255 The `TestReporter` util class is safer to use in parallel tests; its methods require passing the scenario name
- #3265 #3288 #3297 #3299 #3309 #3316 #3322 #3328 #3330 Changes related to UniversalKafkaSource/Sink:
  - `RuntimeSchemaData` is generic - parametrized by `ParsedSchema` (AvroSchema and JsonSchema are supported)
  - `NkSerializableAvroSchema` renamed to `NkSerializableParsedSchema`
  - `SchemaWithMetadata` wraps `ParsedSchema` instead of AvroSchema
  - `SchemaRegistryProvider` refactoring:
    - renamed `SchemaRegistryProvider` to `SchemaBasedSerdeProvider`
    - decoupled `SchemaRegistryClientFactory` from `SchemaBasedSerdeProvider`
  - `KafkaAvroKeyValueDeserializationSchemaFactory` renamed to `KafkaSchemaBasedKeyValueDeserializationSchemaFactory`
  - `KafkaAvroValueSerializationSchemaFactory` renamed to `KafkaSchemaBasedValueSerializationSchemaFactory`
  - `KafkaAvroKeyValueSerializationSchemaFactory` renamed to `KafkaSchemaBasedKeyValueSerializationSchemaFactory`
- #3253 `DeploymentManager` has a separate `validate` method, which should perform initial scenario validation and return reasonably quickly (while deploy can e.g. make a Flink savepoint, etc.)
- #3313 Generic types handling changes (see the sketch after this list):
  - `Typed.typedClass(Class[_], List[TypingResult])` is not available anymore. You should use the more explicit `Typed.genericTypeClass` instead
  - We check the number of generic parameters in `Typed.genericTypeClass` - a wrong number will now cause an exception
  - We populate generic parameters with the correct number of `Unknown` in the non-generic-aware versions of the `Typed` factory methods, like `Typed.apply` or `Typed.typedClass`
- #3071 More strict Avro schema validation:
  - `ValidationMode.allowOptional` was removed; please use `ValidationMode.lax` instead
  - `ValidationMode.allowRedundantAndOptional` was removed; please use `ValidationMode.lax` instead
  - The `acceptUnfilledOptional` and `acceptRedundant` fields of `ValidationMode` were removed
- #3376 `FlinkKafkaSource.flinkSourceFunction`, `FlinkKafkaSource.createFlinkSource` and `DelayedFlinkKafkaConsumer.apply` now take an additional argument, `FlinkCustomNodeContext`
- #3272 `KafkaZookeeperServer` renamed to `EmbeddedKafkaServer`; the `zooKeeperServer` field changed its type to `Option` and is now hidden.
- #3365 Numerous renames:
  - module `nussknacker-avro-components-utils` -> `nussknacker-schemed-kafka-components-utils`
  - module `nussknacker-flink-avro-components-utils` -> `nussknacker-flink-schemed-kafka-components-utils`
  - package `pl.touk.nussknacker.engine.avro` -> `pl.touk.nussknacker.engine.schemedkafka`
  - object `KafkaAvroBaseComponentTransformer` -> `KafkaUniversalComponentTransformer`
- #3412 More strict filtering of method types. Methods with parameters or results like `Collection[IllegalType]` are no longer available in SpEL.
- #3542 Numerous renames:
  - trait `NodeAdditionalInfo` -> `AdditionalInfo`
  - class `MarkdownNodeAdditionalInfo` -> `MarkdownAdditionalInfo`
  - trait `NodeAdditionalInfoProvider` -> `AdditionalInfoProvider` - the SPI provider's configuration files must be renamed from `pl.touk.nussknacker.engine.additionalInfo.NodeAdditionalInfoProvider` to `pl.touk.nussknacker.engine.additionalInfo.AdditionalInfoProvider`
  - method `AdditionalInfoProvider.additionalInfo` renamed to `nodeAdditionalInfo`, and a new method `propertiesAdditionalInfo` was added
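A sketch of the `Typed.genericTypeClass` usage from #3313 (the import path is an assumption):

```scala
// the import path is an assumption - adjust to your Nussknacker version
import pl.touk.nussknacker.engine.api.typed.typing.Typed

// before 1.5.0 (no longer available):
// val listOfStrings = Typed.typedClass(classOf[java.util.List[_]], List(Typed[String]))

// since 1.5.0 - generic parameters are passed explicitly and their count is verified:
val listOfStrings = Typed.genericTypeClass(classOf[java.util.List[_]], List(Typed[String]))
```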
### REST API changes
- #3169 The API endpoint `/api/app/healthCheck`, returning a short JSON answer with "OK" status, is no longer secured - before this change it required an authenticated user with the "read" permission.
### Scenario authoring changes
- #3187 #3224 The Choice component replaces the Switch component. The "Default" choice edge type, `exprVal` and `expression` are now deprecated. For existing usages, you don't need to change anything. For new usages, if you want to extract a value, e.g. to simplify choice conditions, you need to define a new local variable before the choice using the variable component. The "Default" choice edge type can be replaced by adding a "true" condition at the end of the list of conditions.
### Breaking changes
- #3328 Due to the addition of support for different schema types (AvroSchema and JsonSchema for now), the serialization format of `NkSerializableParsedSchema` has changed. Flink state compatibility of scenarios which use Avro sources or sinks has been broken.
- #3365 Due to renames (see the `Code API changes` section), Flink state compatibility of scenarios which use Avro sources or sinks has been broken.
### Other changes
- #3249 #3250 Some Kafka-related libraries were bumped: Confluent 5.5 -> 7.2, Avro 1.9 -> 1.11, Kafka 2.4 -> 3.2. This may affect your custom components if they depend on the `kafka-components-utils` or `avro-components-utils` module
- #3376 The behavior of Flink's Kafka deserialization error handling has changed - now, instead of failing the job, the invalid message is omitted and the configured `exceptionHandler` mechanism is used.