# Migration guide

To see the biggest differences, please consult the changelog.

## In version 1.18.0
### Configuration changes
- Button name for 'test adhoc' was renamed from `test-with-form` to `adhoc-testing`. If you are using a custom button config, remember to update the button type to `type: "adhoc-testing"` in `processToolbarConfig`.
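  If you maintain a custom button configuration, the updated fragment might look roughly like the sketch below (the panel type, neighbouring button, and placement are illustrative assumptions; only `type: "adhoc-testing"` comes from this change):

  ```hocon
  processToolbarConfig {
    defaultConfig {
      topRight: [
        {
          type: "buttons-panel"          # illustrative panel type
          buttons: [
            { type: "process-deploy" }   # illustrative neighbouring button
            # before 1.18.0 this was: { type: "test-with-form" }
            { type: "adhoc-testing" }
          ]
        }
      ]
    }
  }
  ```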
- Scenario Activity audit log is available under the logger name `scenario-activity-audit`. It is optional and does not have to be configured - it uses the MDC context. Example configuration in `logback.xml`:

  ```xml
  <logger name="scenario-activity-audit" level="INFO" additivity="false">
      <appender-ref ref="STDOUT_FOR_SCENARIO_ACTIVITY_AUDIT"/>
  </logger>

  <appender name="STDOUT_FOR_SCENARIO_ACTIVITY_AUDIT" class="ch.qos.logback.core.ConsoleAppender">
      <encoder>
          <Pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - [scenarioId=%X{scenarioId}][version=%X{scenarioVersionId}][user=%X{username}] %msg%n</Pattern>
      </encoder>
  </appender>
  ```

- #6979 Add `type: "activities-panel"` to the `processToolbarConfig`, which replaces the removed `{ type: "versions-panel" }`, `{ type: "comments-panel" }` and `{ type: "attachments-panel" }`.
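  For a custom toolbar configuration, the swap can be sketched as below (illustrative fragment; the placement among other panels is an assumption):

  ```hocon
  processToolbarConfig {
    defaultConfig {
      topRight: [
        # replaces the removed versions-panel, comments-panel and attachments-panel
        { type: "activities-panel" }
      ]
    }
  }
  ```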
### Code API changes
- #6971 `DeploymentManagerDependencies` API changes:
  - Added field `scenarioActivityManager: ScenarioActivityManager`
  - `scenarioActivityManager` can be used by any `DeploymentManager` to save scenario activities in the Nu database
  - there is a `NoOpScenarioActivityManager` implementation available (if needed for tests etc.)
- #6971 `DeploymentManager` compatible API changes:
  - `DeploymentManager` may now optionally extend `ManagerSpecificScenarioActivitiesStoredByManager` - managers extending that trait may internally handle some manager-specific `ScenarioActivities` and may be queried about those custom activities
- #6695 `SingleTypingResult` API changes:
  - Renamed `objType` to `runtimeObjType`, which indicates the current object at runtime.
- #6766 Process API changes:
  - Field `ScenarioWithDetails.labels` was added
  - Field `ScenarioWithDetails.tags` was removed (it had the same value as `labels` and was not used)
- #6988 Removed unused API classes: `MultiMap`, `TimestampedEvictableStateFunction`. `MultiMap` was incorrectly handled by Flink's default Kryo serializer, so if you want to copy it to your code, you should write and register a proper serializer.
### REST API changes
- #6944 New endpoint `/api/scenarioTesting/{scenarioName}/adhoc/validate`
- #6766
  - Process API changes:
    - PUT `/api/processes/{processName}` - optional `scenarioLabels` field added
  - Migration API changes:
    - POST `/api/migrate` supports the v2 request format (with a `scenarioLabels` field)
- #7021 Definitions API changes:
  - GET `/api/processDefinitionData/*`
    - added optional query param `enrichedWithUiConfig`
    - added `requiredParam` property to the response for parameter config at `components['component-id'].parameters[*]`
### Configuration changes
- #6958 Added a message size limit in the "Kafka" exceptionHandler: `maxMessageBytes`. Its default value reflects Kafka's default size limit of 1 MB (`max.message.bytes`); you need to increase it if your error topic allows larger messages. Remember to add some margin for Kafka protocol overhead (100 bytes should be enough).
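  A sketch of the relevant fragment of the exception handler configuration (the topic name and the chosen limit are illustrative):

  ```hocon
  exceptionHandler {
    type: "Kafka"
    topic: "errors"            # illustrative error topic name
    # Kafka's default max.message.bytes is 1 MB; raise this only if the error
    # topic accepts larger messages, keeping ~100 bytes of protocol margin
    maxMessageBytes: 5242780
  }
  ```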
### Other changes
- #6692 Kryo serializers for `UnmodifiableCollection`, `scala.Product` etc. are registered based on the class of the Serializer instead of the instance of the Serializer. If you have values that were serialized by these Serializers in some state, the state won't be restored after the upgrade.
- #6952 Improvement: TypeInformation support for `scala.Option`: if you used `CaseClassTypeInfoFactory` with case classes that contain the `Option` type, the state won't be restored after the upgrade.
- #6805 Updated Flink 1.18.1 -> 1.19.1. Due to backwards incompatible changes in this Flink version update, Nussknacker 1.18 will not work with Flink versions pre-1.19 right away. If you want to keep using Flink pre-1.19 with the current Nussknacker, please refer to the compatibility-providing plugins at https://github.com/TouK/nussknacker-flink-compatibility.
- #6912 Improvement: Make `TimeMeasuringService` usable with other `Lifecycle` traits
  - Services that use `TimeMeasuringService` must be rebuilt
- Performance optimizations:
  - #7058 Add missing Flink TypeInformation for better serialization
    - If you use base (bounded and unbounded) Flink components, the state will probably not be compatible
    - `FlinkCustomNodeContext.typeInformationDetection` has been removed - please use `TypeInformationDetection.instance` instead
    - `FlinkCustomNodeContext.forCustomContext` has been removed - please use `TypeInformationDetection.instance.forValueWithContext` instead
  - #7097 Flink base types registration mechanism
    - If you use the types `java.time.LocalDate`, `java.time.LocalTime`, `java.time.LocalDateTime` with the `CaseClassTypeInfo` mechanism, the state will probably be lost
- #7113 Scala 2.13 was updated to 2.13.15 - you should update your `flink-scala-2.13` to 1.1.2
- #7187 JSON decoding in the `request` source (request-response processing mode) and in the `kafka` source (streaming processing mode): for small decimal numbers, either `Integer` or `Long` is used (depending on the number's size) instead of `BigDecimal`. This change should be transparent in most cases, as this value was mostly used after a `#CONV.toNumber()` invocation, which will still return a `Number`.
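The size-based choice can be illustrated with a short sketch (this is not Nussknacker code; treating everything beyond the `Long` range as `BigDecimal` is an assumption here):

```python
def decoded_number_type(n: int) -> str:
    """Name the Java type the source would now use for a JSON integer."""
    if -2**31 <= n < 2**31:
        return "Integer"       # fits in a 32-bit int
    if -2**63 <= n < 2**63:
        return "Long"          # fits in a 64-bit long
    return "BigDecimal"        # assumption: fallback for anything larger

print(decoded_number_type(42))        # Integer
print(decoded_number_type(10**10))    # Long
```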
## In version 1.17.0
### Code API changes
- #6248 Removed the implicit conversion from string to SpEL expression (`pl.touk.nussknacker.engine.spel.Implicits`). The conversion should be replaced by `pl.touk.nussknacker.engine.spel.SpelExtension.SpelExpresion.spel`.
- #6282 If you relied on the default value of the `topicsExistenceValidationConfig.enabled` setting, you must now be aware that topics will be validated by default (Kafka's `auto.create.topics.enable` setting is only considered in the case of Sinks). Create proper topics manually if needed.
- Component's API changes
  - #6711 `SingleComponentConfig` changed to `ComponentConfig` for better domain naming. Associated functions and objects also changed to `...ComponentConfig...`.
  - #6418 Improvement: Pass implicit nodeId to `EagerServiceWithStaticParameters.returnType`
    - Now the method `returnType` from `EagerServiceWithStaticParameters` requires an implicit nodeId param
  - #6462 `CustomStreamTransformer.canHaveManyInputs` field was removed. You don't need to implement any other method in replacement, just remove this method.
  - #6340 `TestRecordParser` trait used in the `SourceTestSupport` trait changed to work on lists instead of single records - its `parse` method now takes `List[TestRecord]` instead of a single `TestRecord` and returns a list of results instead of a single result.
  - #6520 `ExplicitTypeInformationSource` trait was removed - now `TypeInformation` produced by the `SourceFunction` passed to `StreamExecutionEnvironment.addSource` is detected based on `TypingResult` (thanks to `TypeInformationDetection`)
    - `BlockingQueueSource.create` takes a `ClassTag` implicit parameter instead of `TypeInformation`
    - `EmitWatermarkAfterEachElementCollectionSource.create` takes a `ClassTag` implicit parameter instead of `TypeInformation`
    - `CollectionSource`'s `TypeInformation` implicit parameter was removed
    - `EmptySource`'s `TypeInformation` implicit parameter was removed
  - #6545 `FlinkSink.prepareTestValue` was replaced by `prepareTestValueFunction` - a parameterless method returning a function. Thanks to that, `FlinkSink` is not serialized during test data preparation.
- `TypingResult` API changes
  - #6436 Changes to `TypingResult` of SpEL expressions that are maps or lists:
    - `TypedObjectTypingResult.valueOpt` now returns a `java.util.Map` instead of `scala.collection.immutable.Map`
      - NOTE: selection (`.?`) or operations from the `#COLLECTIONS` helper cause the map to lose track of its keys/values, reverting its `fields` to an empty Map
    - SpEL list expressions are now typed as `TypedObjectWithValue`, with the `underlying` `TypedClass` equal to the `TypedClass` before this change, and with `value` equal to a `java.util.List` of the elements' values.
      - NOTE: selection (`.?`), projection (`.!`) or operations from the `#COLLECTIONS` helper cause the list to lose track of its values, reverting it to a value-less `TypedClass` like before the change
  - #6566 `TypedObjectTypingResult.fields` are backed by `ListMap` for the purpose of a correct `RowTypeInfo` fields order. If #5457 migrations were applied, it should be a transparent change
    - Removed deprecated `TypedObjectTypingResult.apply` methods - the `Typed.record` factory method should be used instead
    - `Typed.record` factory method takes `Iterable` instead of `Map`
  - #6570 `TypingResult.canBeSubclassOf` generic parameter checking related changes. Generic parameters of `Typed[java.util.Map[X, Y]]`, `Typed[java.util.List[X]]`, `Typed[Array[X]]` were checked as if they were either covariant or contravariant. Now they are checked more strictly - depending on the collection's characteristics:
    - the `Key` parameter of `Typed[java.util.Map[Key, Value]]` is treated as invariant
    - the `Value` parameter of `Typed[java.util.Map[Key, Value]]` is treated as covariant
    - the `Element` parameter of `Typed[java.util.List[Element]]` is treated as covariant
    - the `Element` parameter of `Typed[Array[Element]]` is treated as covariant
- #6503 `FlinkTestScenarioRunner` cleanups:
  - `runWithDataAndTimestampAssigner` method was removed. Instead, `timestampAssigner` was added as an optional parameter of `runWithData`
  - new `runWithDataWithType` was added, allowing tests using types other than classes, e.g. records
- #6567 Removed the ability to set Flink's execution mode in sources: `TableSource`, `CollectionSource`, and in the `FlinkTestScenarioRunner.runWithData` method. Now you can configure it under `modelConfig.executionMode` or, for test purposes, through the `FlinkTestScenarioRunnerBuilder.withExecutionMode` method.
- #6610 Add Flink node context as a parameter to `BasicFlinkSink`. Now one can use `FlinkCustomNodeContext` in order to build a sink in the `BasicFlinkSink#toFlinkFunction` method.
- #6635 #6643 `TypingResult`-related `TypeInformation` changes:
  - `TypingResultAwareTypeInformationCustomisation` API was removed
  - `FlinkCustomNodeContext.typeInformationDetection` is deprecated - use `TypeInformationDetection.instance` instead
  - `FlinkCustomNodeContext.valueWithContextInfo.forCustomContext` is deprecated - use `TypeInformationDetection.instance.forValueWithContext` instead
- #6640 `BestEffort*Encoder` naming changes:
  - All `BestEffort*Encoder` classes renamed to fit the `To<TargetFormat>(SchemaBased)Encoder` naming schema
  - `JsonToNuStruct` renamed to `FromJsonDecoder` (to fit the `From<SourceFormat>Decoder` naming schema)
  - `ToJsonEncoder` renamed to `ToJsonEncoderCustomisation`
  - `ToJsonBasedOnSchemaEncoder` renamed to `ToJsonSchemaBasedEncoderCustomisation`
- #6586 From now on, the SQL enricher automatically converts types as shown below:
  - `java.sql.Array` -> `java.util.List`
  - `java.sql.Time` -> `java.time.LocalTime`
  - `java.sql.Date` -> `java.time.LocalDate`
  - `java.sql.Timestamp` -> `java.time.Instant`
  - `java.sql.Clob` -> `java.lang.String`