HTTP API
The HTTP API intends to expose most Talend Component Kit features over HTTP. It is a standalone Java HTTP server.
The WebSocket protocol is activated for the endpoints. Endpoints then use /websocket/v1 as base instead of /api/v1. See WebSocket for more details.
The API description is browsable using an OpenAPI browser.
To make sure that the migration can be activated, you need to set the version the component was created with in the execution configuration that you send to the server (the component version is returned by the component detail endpoint). To do that, use the tcomp::component::version key.
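As an illustration, here is a minimal sketch of building such an execution configuration in Java; only the tcomp::component::version key comes from this documentation, the other entry and all values are hypothetical:
final Map<String, String> executionConfiguration = new HashMap<>(); // java.util.Map / java.util.HashMap
executionConfiguration.put("tcomp::component::version", "2"); // the version the stored configuration was created with (see the component detail endpoint)
executionConfiguration.put("configuration.someOption", "someValue"); // hypothetical component configuration entry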
Deprecated endpoints
Endpoints that are intended to disappear will be deprecated. An X-Talend-Warning
header will be returned with a message as value.
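As an illustration, a client can watch for that header to detect deprecated endpoints. Here is a minimal sketch using the standard JAX-RS client API; the base URL is an assumption for the example:
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.Response;

public class DeprecationCheck {
    public static void main(final String[] args) {
        final Client client = ClientBuilder.newClient();
        try {
            // the base URL is an assumption for this sketch
            final Response response = client
                    .target("http://localhost:8080/api/v1/component/index")
                    .request("application/json")
                    .get();
            final String warning = response.getHeaderString("X-Talend-Warning");
            if (warning != null) {
                System.out.println("Deprecated endpoint: " + warning);
            }
        } finally {
            client.close();
        }
    }
}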
WebSocket transport
You can connect to any endpoint by:
- Replacing /api with /websocket
- Appending /<http method> to the URL
- Formatting the request as:
SEND
destination: <endpoint after v1>
<headers>
<payload>^@
For example:
SEND
destination: /component/index
Accept: application/json
^@
The response is formatted as follows:
MESSAGE
status: <http status code>
<headers>
<payload>^@
All endpoints are logged at startup. You can then find them in the logs if you have a doubt about which one to use.
If you don’t want to create a pool of connections per endpoint/verb, you can use the bus endpoint: /websocket/v1/bus.
This endpoint requires that you add the destinationMethod header to each request with the verb value (GET by default):
SEND
destination: /component/index
destinationMethod: GET
Accept: application/json
^@
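For illustration, here is a minimal client sketch using the standard javax.websocket (JSR-356) API against the bus endpoint. It assumes a JSR-356 client implementation is available at runtime, that the server listens on localhost:8080, and that the trailing ^@ of a frame maps to a NUL character:
import java.net.URI;
import javax.websocket.ClientEndpoint;
import javax.websocket.ContainerProvider;
import javax.websocket.OnMessage;
import javax.websocket.Session;

@ClientEndpoint
public class BusClient {
    @OnMessage
    public void onMessage(final String message) {
        // the MESSAGE frame (status, headers and payload) arrives here
        System.out.println(message);
    }

    public static void main(final String[] args) throws Exception {
        try (final Session session = ContainerProvider.getWebSocketContainer()
                .connectToServer(BusClient.class, URI.create("ws://localhost:8080/websocket/v1/bus"))) {
            session.getBasicRemote().sendText(
                    "SEND\n"
                    + "destination: /component/index\n"
                    + "destinationMethod: GET\n"
                    + "Accept: application/json\n"
                    + "\n" // assumption: headers are terminated by a blank line, as in the frames above
                    + "\u0000"); // assumption: the trailing ^@ is the NUL character
            Thread.sleep(5000); // crude wait for the response before closing, for the sake of the example
        }
    }
}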
Server configuration
The configuration is read from system properties, environment variables, etc.
- talend.component.server.component.coordinates: A comma-separated list of GAVs used to locate the components.
- talend.component.server.component.extend.dependencies: Default value: true. Should the component extensions add required dependencies.
- talend.component.server.component.registry: A property file where the value is a GAV of a component to register (complementary with coordinates).
- talend.component.server.documentation.active: Default value: true. Should the /documentation endpoint be activated.
- talend.component.server.execution.dataset.retriever.timeout: Default value: 180. How long the read execution endpoint can last (max).
- talend.component.server.execution.pool.wait: Default value: PT10S. How long the application waits during shutdown for the execution tasks to complete.
- talend.component.server.jaxrs.exceptionhandler.defaultMessage: Default value: false. If set, it replaces any exception message. Set to false to use the actual exception message.
- talend.component.server.maven.repository: The local Maven repository used to locate components and their dependencies.
- talend.component.server.metrics.active: Default value: true. Should the /api/v1/metrics endpoint be activated.
- talend.component.server.monitoring.brave.service.name: Default value: component-server. The name used by the Brave integration (Zipkin).
- talend.component.server.security.command.handler: Default value: securityNoopHandler. How to validate a command/request. Accepted values: securityNoopHandler.
- talend.component.server.security.connection.handler: Default value: securityNoopHandler. How to validate a connection. Accepted values: securityNoopHandler.
HTTPS activation
Using the server ZIP (or Docker image), you can configure HTTPS by adding properties to MEECROWAVE_OPTS. Assuming that you have a certificate in /opt/certificates/component.p12 (don’t forget to add/mount it in the Docker image if you use it), you can activate it as follows:
# use -e for Docker
#
# this skips the http port binding, only binds https on port 8443 and sets up the correct certificate
export MEECROWAVE_OPTS="-Dskip-http=true -Dssl=true -Dhttps=8443 -Dkeystore-type=PKCS12 -Dkeystore-alias=talend -Dkeystore-password=talend -Dkeystore-file=/opt/certificates/component.p12"
Web forms and REST API
The component-form library provides a way to build a component REST API facade that is compatible with the React form library.
For example:
@Path("tacokit-facade")
@ApplicationScoped
public class ComponentFacade {
private static final String[] EMPTY_ARRAY = new String[0];
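// note: the "log" calls in onException() assume a logger provided by the application (for example through Lombok's @Slf4j)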
@Inject
private Client client;
@Inject
private ActionService actionService;
@Inject
private UiSpecService uiSpecService;
@Inject // assuming it is available in your app, use any client you want
private WebTarget target;
@POST
@Path("action")
public void action(@Suspended final AsyncResponse response, @QueryParam("family") final String family,
@QueryParam("type") final String type, @QueryParam("action") final String action,
final Map<String, Object> params) {
client.action(family, type, action, params).handle((r, e) -> {
if (e != null) {
onException(response, e);
} else {
response.resume(actionService.map(type, r));
}
return null;
});
}
@GET
@Path("index")
public void getIndex(@Suspended final AsyncResponse response,
@QueryParam("language") @DefaultValue("en") final String language) {
target
.path("component/index")
.queryParam("language", language)
.request(APPLICATION_JSON_TYPE)
.rx()
.get(ComponentIndices.class)
.toCompletableFuture()
.handle((index, e) -> {
if (e != null) {
onException(response, e);
} else {
index.getComponents().stream().flatMap(c -> c.getLinks().stream()).forEach(
link -> link.setPath(link.getPath().replaceFirst("/component/", "/application/").replace(
"/details?identifiers=", "/detail/")));
response.resume(index);
}
return null;
});
}
@GET
@Path("detail/{id}")
public void getDetail(@Suspended final AsyncResponse response,
@QueryParam("language") @DefaultValue("en") final String language, @PathParam("id") final String id) {
target
.path("component/details")
.queryParam("language", language)
.queryParam("identifiers", id)
.request(APPLICATION_JSON_TYPE)
.rx()
.get(ComponentDetailList.class)
.toCompletableFuture()
.thenCompose(result -> uiSpecService.convert(result.getDetails().iterator().next()))
.handle((result, e) -> {
if (e != null) {
onException(response, e);
} else {
response.resume(result);
}
return null;
});
}
private void onException(final AsyncResponse response, final Throwable e) {
final UiActionResult payload;
final int status;
if (WebException.class.isInstance(e)) {
final WebException we = WebException.class.cast(e);
status = we.getStatus();
payload = actionService.map(we);
} else if (CompletionException.class.isInstance(e)) {
final CompletionException actualException = CompletionException.class.cast(e);
log.error(actualException.getMessage(), actualException);
status = Response.Status.BAD_GATEWAY.getStatusCode();
payload = actionService.map(new WebException(actualException, -1, emptyMap()));
} else {
log.error(e.getMessage(), e);
status = Response.Status.BAD_GATEWAY.getStatusCode();
payload = actionService.map(new WebException(e, -1, emptyMap()));
}
response.resume(new WebApplicationException(Response.status(status).entity(payload).build()));
}
}
The Client can be created using ClientFactory.createDefault(System.getProperty("app.components.base", "http://localhost:8080/api/v1")) and the service can be a simple new UiSpecService<>(). The factory uses JAX-RS if the API is available (assuming a JSON-B provider is registered). Otherwise, it tries to use Spring.
The conversion from the component model (REST API) to the uiSpec model is done through UiSpecService. It is based on the object model, which is mapped to a UI model. Having a flat model in the component REST API makes it easy to customize layers.
You can completely control the available components, tune the rendering by switching the uiSchema, and add or remove parts of the form.
You can also add custom actions and buttons for specific needs of the application.
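For instance, once uiSpecService.convert(...) has returned the Ui model (see getDetail above), the uiSchema can be adjusted before it is sent to the front end. This is only a sketch and assumes the model exposes plain getters/setters mirroring the builder API (getUiSchema(), getWidget(), setWidget()):
ui.getUiSchema().stream() // "ui" is the converted Ui instance
    .filter(schema -> "multiSelectTag".equals(schema.getWidget()))
    .forEach(schema -> schema.setWidget("datalist")); // illustrative widget switch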
The /migrate endpoint was not shown in the previous snippet but if you need it, add it as well.
Using the UiSpec model without the tooling
<dependency>
<groupId>org.talend.sdk.component</groupId>
<artifactId>component-form-model</artifactId>
<version>${talend-component-kit.version}</version>
</dependency>
This Maven dependency provides the UiSpec model classes. You can use the Ui API (with or without the builders) to create UiSpec representations.
For example:
final Ui form1 = ui()
.withJsonSchema(JsonSchema.jsonSchemaFrom(Form1.class).build()) (1)
.withUiSchema(uiSchema() (2)
.withKey("multiSelectTag")
.withRestricted(false)
.withTitle("Simple multiSelectTag")
.withDescription("This data list accepts values that are not in the list of suggestions")
.withWidget("multiSelectTag")
.build())
.withProperties(myFormInstance) (3)
.build();
final String json = jsonb.toJson(form1); (4)
(1) The JsonSchema is extracted from reflection on the Form1 class. Note that @JsonSchemaIgnore allows you to ignore a field and @JsonSchemaProperty allows you to rename a property.
(2) A UiSchema is programmatically built using the builder API.
(3) An instance of the form is passed to let the serializer extract its JSON model.
(4) The Ui model, which can be used by UiSpec-compatible front widgets, is serialized.
The model uses the JSON-B API to define the binding. Make sure to have an implementation in your classpath. To do that, add the following dependencies:
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-jsonb_1.0_spec</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-json_1.1_spec</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>org.apache.johnzon</groupId>
<artifactId>johnzon-jsonb</artifactId>
<version>${johnzon.version}</version> <!-- 1.1.5 for instance -->
</dependency>
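For reference, the jsonb instance used in the serialization example above can be obtained through the standard JSON-B API; a minimal sketch:
import javax.json.bind.Jsonb;
import javax.json.bind.JsonbBuilder;

final Jsonb jsonb = JsonbBuilder.create(); // picks up the JSON-B implementation from the classpath (Johnzon here)
final String json = jsonb.toJson(form1);   // "form1" is the Ui built in the previous example
// Jsonb is AutoCloseable: close it (or reuse a single instance) when you are done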
JavaScript integration
component-kit.js is no longer available (previous versions remain on NPM) since it was replaced by @talend/react-containers.
The previous import can be replaced by: import kit from '@talend/react-containers/lib/ComponentForm/kit';
Default JavaScript integration goes through the Talend UI Forms library and its Containers wrapper.
Documentation is now available through the previous link.
Logging
The logging uses Log4j2. You can specify a custom configuration by using the -Dlog4j.configurationFile system property or by adding a log4j2.xml file to the classpath.
Here are some common configurations:
- Console logging:
<?xml version="1.0"?>
<Configuration status="INFO">
<Appenders>
<Console name="Console" target="SYSTEM_OUT">
<PatternLayout pattern="[%d{HH:mm:ss.SSS}][%highlight{%-5level}][%15.15t][%30.30logger] %msg%n"/>
</Console>
</Appenders>
<Loggers>
<Root level="INFO">
<AppenderRef ref="Console"/>
</Root>
</Loggers>
</Configuration>
Output messages look like:
[16:59:58.198][INFO ][ main][oyote.http11.Http11NioProtocol] Initializing ProtocolHandler ["http-nio-34763"]
- JSON logging:
<?xml version="1.0"?>
<Configuration status="INFO">
<Properties>
<!-- DO NOT PUT logSource there, it is useless and slow -->
<Property name="jsonLayout">{"severity":"%level","logMessage":"%encode{%message}{JSON}","logTimestamp":"%d{ISO8601}{UTC}","eventUUID":"%uuid{RANDOM}","@version":"1","logger.name":"%encode{%logger}{JSON}","host.name":"${hostName}","threadName":"%encode{%thread}{JSON}","stackTrace":"%encode{%xThrowable{full}}{JSON}"}%n</Property>
</Properties>
<Appenders>
<Console name="Console" target="SYSTEM_OUT">
<PatternLayout pattern="${jsonLayout}"/>
</Console>
</Appenders>
<Loggers>
<Root level="INFO">
<AppenderRef ref="Console"/>
</Root>
</Loggers>
</Configuration>
Output messages look like:
{"severity":"INFO","logMessage":"Initializing ProtocolHandler [\"http-nio-46421\"]","logTimestamp":"2017-11-20T16:04:01,763","eventUUID":"8b998e17-7045-461c-8acb-c43f21d995ff","@version":"1","logger.name":"org.apache.coyote.http11.Http11NioProtocol","host.name":"TLND-RMANNIBUCAU","threadName":"main","stackTrace":""}
- Rolling file appender:
<?xml version="1.0"?>
<Configuration status="INFO">
<Appenders>
<RollingRandomAccessFile name="File" fileName="${LOG_PATH}/application.log" filePattern="${LOG_PATH}/application-%d{yyyy-MM-dd}.log">
<PatternLayout pattern="[%d{HH:mm:ss.SSS}][%highlight{%-5level}][%15.15t][%30.30logger] %msg%n"/>
<Policies>
<SizeBasedTriggeringPolicy size="100 MB" />
<TimeBasedTriggeringPolicy interval="1" modulate="true"/>
</Policies>
</RollingRandomAccessFile>
</Appenders>
<Loggers>
<Root level="INFO">
<AppenderRef ref="File"/>
</Root>
</Loggers>
</Configuration>
More details are available in the RollingFileAppender documentation.
You can compose the previous layouts (message format) and appenders (where logs are written).
Docker
The server image is deployed on Docker. Its version is suffixed with a timestamp to ensure images are not overridden, which could break your usage. You can check the available versions on Docker Hub.
Run
You can run the Docker image by executing this command:
$ sudo docker run -p 8080:8080 tacokit/component-starter
Configure
You can set the MEECROWAVE_OPTS environment variable to customize the server. By default, it is installed in /opt/talend/component-kit.
Maven repository
The Maven repository is the default one of the machine. You can change it by setting the system property talend_component_server_maven_repository=/path/to/your/m2.
Deploy components to the server
If you want to deploy some components, you can configure which ones in MEECROWAVE_OPTS (see the server documentation) and mount your local m2 repository:
$ docker run \
-p 8080:8080 \
-v ~/.m2:/root/.m2 \
-e MEECROWAVE_OPTS="-Dtalend.component.server.component.coordinates=g:a:v,g2:a2:v2,..." \
component-server
Logging
The component server Docker image comes with two Log4j2 profiles: default and kafka.
The logging profile can be changed by setting the TALEND_COMPONENT_LOG4J2_PROFILE environment variable to kafka; the default profile is active by default.
default profile
The default profile has file and console logging capabilities.
Console logging is off by default; you can activate it by setting the CONSOLE_LOG_LEVEL environment variable to DEBUG, INFO, WARN or any other log level supported by Log4j2. In practice, during development you will want to activate console logging to see the logs without connecting to the server.
Run the Docker image with console logging:
sudo docker run -p 8080:8080 \
-e CONSOLE_LOG_LEVEL=INFO \
component-server
Kafka profile
The Kafka profile lets you send the logs to Kafka servers. The logs are formatted in JSON and follow the layout defined by Talend, described at github.com/Talend/daikon/tree/master/daikon-logging/logging-event-layout.
This profile requires two environment variables:
- LOG_KAFKA_TOPIC: the Kafka topic to send the logs to.
- LOG_KAFKA_URL: a list of host/port pairs used to establish the initial connection to the Kafka cluster. This list should be in the form url:port, separated by commas.
Run the Docker image with the Kafka profile:
sudo docker run -p 8080:8080 \
-e TALEND_COMPONENT_LOG4J2_PROFILE=kafka \
-e LOG_KAFKA_URL=`log kafka url:port` \
-e LOG_KAFKA_TOPIC=`log kafka topic` \
-e TRACING_KAFKA_URL=`tracing kafka url:port` \
-e TRACING_KAFKA_TOPIC=`tracing kafka topic` \
tacokit/component-server
Note: LOG_KAFKA_TOPIC receives the application and access logs, and TRACING_KAFKA_TOPIC receives the Brave tracing logs.
OpenTracing
The component server uses Geronimo OpenTracing to monitor requests.
The tracing can be activated by setting the TRACING_ON environment variable to true.
You can choose the reporter type by setting the talend_component_server_monitoring_brave_reporter_type environment variable to log (the default value in this Docker image) or to noop, which deactivates the tracing. Other types of reporters may be added in the future.
The tracing rate is configurable by setting the TRACING_SAMPLING_RATE environment variable. It accepts 0 (none) and 1 (all, default) as values to ensure the consistency of the reporting.
Run the Docker image with tracing on:
sudo docker run -p 8080:8080 \
-e TRACING_ON=true \
tacokit/component-server
By default, Geronimo OpenTracing logs the spans in Zipkin format, so you can use the Kafka profile as explained before to wire it to any OpenTracing backend.
Build the image yourself
You can build the component starter server Docker image by following these instructions:
docker build --build-arg ARTIFACT_ID=component-starter-server \
--build-arg SERVER_VERSION=`component starter server version` \
--tag tacokit/component-server .
This assumes that the project is built before you run that command.