Release Notes

New features

Application composition projects: IBM Financial Services Workbench 3.0 lets you turn microservices into components and reuse them in multiple application composition projects.
New pipeline configurations: Version 3.0 comes with two new pipeline configurations, one for publishing a service project to the new component repository and one for deploying a single microservice to a deployment target.
Component repository: There is now a component repository that acts as a catalog of your business capabilities by providing all published microservices as components to be added to application composition projects.
GitOps mechanism for applications: IBM Financial Services Workbench 3.0 introduces a new deployment mechanism for application composition projects that follows GitOps principles.
Workspaces: Workspaces allow you to focus your work by adding only the projects you are interested in to a dedicated workspace. You can create as many workspaces as you need and add as many projects to each workspace as you want.

Known issues

Update of pom.xml file: Currently the root-level pom.xml file is not updated automatically when the Java SDK version has changed. This has to be done manually in the Git repository as described here.
API namespace creation: Currently an API namespace can only be created from scratch or by uploading an API specification. The remaining option (clone) will be introduced in a later release and is disabled in the current version.
Fetching permissions: When you receive new permissions or permissions are changed via repository membership or role mapping, activation is not immediate. If repository membership changes, it may take up to 3 minutes for the new permissions to become active; this time period is currently not configurable. If role mappings change, log out and log in again to activate the permissions. To shorten the activation time, you can clear the cache manually by calling the corresponding API, for example through the Swagger UI.
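As an illustration only, clearing the cache amounts to one HTTP call against the corresponding API. The endpoint path, function name, and authentication header below are placeholders, not part of the product documentation; look up the real operation in the Swagger UI of your installation.

// Hypothetical sketch: clear the permission cache via the REST API.
// The endpoint path and the bearer token handling are assumptions.
async function clearPermissionCache(baseUrl: string, token: string): Promise<void> {
  const response = await fetch(`${baseUrl}/api/v1/cache/clear`, { // placeholder path
    method: "POST",
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!response.ok) {
    throw new Error(`Cache clear failed: ${response.status} ${response.statusText}`);
  }
}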
Discriminator property for "oneOf" schemas: When you create a "oneOf" schema and set a discriminator, a required property with the discriminator name is created and automatically attached to each schema that belongs to the "oneOf" schema. Therefore, if any of these schemas is used independently, the discriminator property must be set to #/components/schemas/<schema_localIdentifier>.
Defining the discriminator property: If any of the schemas of a "oneOf" schema contains a property with the same name as the discriminator, that property will be replaced by the discriminator, which is an Enum value of type String. Therefore, make sure that none of the schemas of a "oneOf" schema has a property with the same name as the discriminator.
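For illustration, assuming a hypothetical "oneOf" schema with a discriminator named paymentType and a member schema whose local identifier is CardPayment (both names are assumptions, not from the product documentation), a payload that uses the member schema on its own would have to carry the discriminator explicitly:

// Hypothetical payload using one member schema of a "oneOf" schema independently.
// The schema name and the other properties are illustrative only.
const payload = {
  // discriminator property, set to the schema reference as required
  paymentType: "#/components/schemas/CardPayment",
  cardNumber: "4111111111111111",
  amount: 42.5,
};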
Working with two developers on one project: In case of conflicts, delete the project's package-lock.json and execute fss pull.
GitHub Enterprise integration: The initial commit shows wrong user data in the Git repository.
Local marketplace: The Local Marketplace feature is not available on a fresh installation without additional manual steps.
Existing bindings: Existing default or custom bindings in runtime namespaces still work but are not shown via the Configuration Management REST API.
Environment level supports only one message hub binding: Currently it is not possible to configure more than one Message Hub Service Binding (Kafka binding) on the Environment level in Solution Hub. However, you can still configure and use multiple Message Hub Service Bindings on the Project level.
Local debugging of TypeScript low-code projects with events: Local debugging of low-code projects based on TypeScript that include events requires additional manual configuration in order to connect to the Kafka cluster.
Usage of arrays/complex entities as event payload: Sending an array or a complex payload entity (i.e. an entity whose properties are themselves entities) is not supported, because such an entity is represented as nested value dictionaries that cannot be reconstructed when events are consumed and agents are fired. If the event payload is a simple structure (an entity with simple properties only), it is reconstructed successfully, the event is consumed, and the agent is fired.
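As a sketch of the distinction, assuming two hypothetical payload entities (the names and properties are illustrative, not taken from the product):

// Supported: a flat payload entity with simple properties only.
interface AccountCreatedPayload {
  accountId: string;
  owner: string;
  balance: number;
}

// Not supported: a payload entity whose property is itself an entity;
// the nested value dictionary cannot be reconstructed on the consumer side.
interface OrderPlacedPayload {
  orderId: string;
  customer: { id: string; name: string };
}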
CPD import/export functionality not currently supported: Currently the Cloud Pak for Data import/export framework is not supported. There are some product-specific APIs, and native tooling can be used for the product data, which is primarily stored in the customer's own Git repositories.
Domain agents: Domain Agents that are bound to an Event can only be executed if Testing Support is enabled in the Project Configuration (please see section 'Testing - Enable Testing Support').
Integration: An OpenAPI 3.0 specification generated by a Java low-code project will not work as a Local Lookup API dependency in a TypeScript low-code project. This is due to differences between the Java-generated specification (operation paths) and the expected API Binding URL.
Configure Component: If a component in an application composition has a custom configuration, the configuration can no longer be deleted completely; if the corresponding input field is empty, the Save button is disabled. As a workaround, a line containing only a comment can be added to the custom configuration.
Swagger-UI/Docs of Java pro-code services: The baseUrl in the generated Swagger docs of Java pro-code APIs is incorrect, so calling a Java pro-code service API does not work from the Swagger UI. As a workaround, the path /{applicationAcronym}/{serviceAcronym} needs to be added to the baseUrl when calling a Java pro-code API. In the case of Single Service Deployment, only the /{serviceAcronym} path needs to be added. For example, if the generated URL of an API is https://dev-stage.apps.openshift/api/v1/customer, it should look like https://dev-stage.apps.openshift/myapp/mysvc/api/v1/customer.
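A minimal sketch of this workaround on the client side, assuming the application and service acronyms are known (the function name is illustrative):

// Sketch: insert /{applicationAcronym}/{serviceAcronym} in front of the
// generated path. For a Single Service Deployment, pass only the service acronym.
function fixBaseUrl(generatedUrl: string, serviceAcronym: string, applicationAcronym?: string): string {
  const url = new URL(generatedUrl);
  const prefix = applicationAcronym ? `/${applicationAcronym}/${serviceAcronym}` : `/${serviceAcronym}`;
  url.pathname = `${prefix}${url.pathname}`;
  return url.toString();
}

// Example (acronyms "myapp" and "mysvc" as in the release note above):
// fixBaseUrl("https://dev-stage.apps.openshift/api/v1/customer", "mysvc", "myapp")
// -> "https://dev-stage.apps.openshift/myapp/mysvc/api/v1/customer"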