Observability - Java SDK
The observability section of the Temporal Developer's guide covers the many ways to view the current state of your Temporal Application—that is, ways to view which Workflow Executions are tracked by the Temporal Platform and the state of any specified Workflow Execution, either currently or at points of an execution.
This section covers features related to viewing the state of the application, including emitting metrics, setting up tracing, logging from a Workflow, and using the Visibility APIs.
Emit metrics
Each Temporal SDK is capable of emitting an optional set of metrics from either the Client or the Worker process. For a complete list of metrics capable of being emitted, see the SDK metrics reference.
Metrics can be scraped and stored in time series databases, such as Prometheus.
Temporal also provides a dashboard you can integrate with graphing services like Grafana. For more information, see:
- Temporal's implementation of the Grafana dashboard
- How to export metrics in Grafana
To emit metrics with the Java SDK, use the MicrometerClientStatsReporter class to integrate with a Micrometer MeterRegistry configured for your metrics backend.
Micrometer is a popular Java framework that provides integration with Prometheus and other backends.
The following example shows how to use MicrometerClientStatsReporter to define the metrics scope and set it with the WorkflowServiceStubsOptions.
//...
// See the Micrometer documentation for configuration details on other supported monitoring systems.
// This example shows how to set up the Prometheus registry and stats reporter.
PrometheusMeterRegistry registry = new PrometheusMeterRegistry(PrometheusConfig.DEFAULT);
StatsReporter reporter = new MicrometerClientStatsReporter(registry);
// Set up a new scope that reports every 10 seconds.
Scope scope =
    new RootScopeBuilder()
        .reporter(reporter)
        .reportEvery(com.uber.m3.util.Duration.ofSeconds(10));
// For Prometheus collection, expose a scrape endpoint.
//...
// Add the metrics scope to the WorkflowServiceStubs options.
WorkflowServiceStubsOptions stubOptions =
    WorkflowServiceStubsOptions.newBuilder().setMetricsScope(scope).build();
//...
For more details, see the Java SDK Samples. For details on configuring a Prometheus scrape endpoint with Micrometer, see the Micrometer Prometheus configuration documentation.
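The scrape endpoint mentioned in the comments above is not shown in the snippet. As a minimal sketch, one way to expose it is with the JDK's built-in com.sun.net.httpserver.HttpServer and the registry created above; the port and path here are arbitrary choices, not Temporal requirements:
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Serve the contents of the PrometheusMeterRegistry at http://localhost:8077/metrics
// so a Prometheus server can scrape it.
HttpServer scrapeServer = HttpServer.create(new InetSocketAddress(8077), 0);
scrapeServer.createContext(
    "/metrics",
    exchange -> {
      byte[] body = registry.scrape().getBytes(StandardCharsets.UTF_8);
      exchange.sendResponseHeaders(200, body.length);
      try (OutputStream os = exchange.getResponseBody()) {
        os.write(body);
      }
    });
scrapeServer.start();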
Set up tracing
Tracing allows you to view the call graph of a Workflow along with its Activities and any Child Workflows.
Temporal Web's tracing capabilities mainly track Activity Execution within a Temporal context. If you need custom tracing specific to your use case, use context propagation to add the tracing logic accordingly.
To configure tracing in Java, register the OpenTracingClientInterceptor() interceptor.
You can register the interceptors on both the Temporal Client side and the Worker side.
The following code examples demonstrate the OpenTracingClientInterceptor() on the Temporal Client.
WorkflowClientOptions.newBuilder()
    //...
    .setInterceptors(new OpenTracingClientInterceptor())
    .build();

WorkflowClientOptions clientOptions =
    WorkflowClientOptions.newBuilder()
        .setInterceptors(new OpenTracingClientInterceptor(JaegerUtils.getJaegerOptions(type)))
        .build();
WorkflowClient client = WorkflowClient.newInstance(service, clientOptions);
The following code examples demonstrate the OpenTracingWorkerInterceptor() on the Worker.
WorkerFactoryOptions.newBuilder()
    //...
    .setWorkerInterceptors(new OpenTracingWorkerInterceptor())
    .build();

WorkerFactoryOptions factoryOptions =
    WorkerFactoryOptions.newBuilder()
        .setWorkerInterceptors(
            new OpenTracingWorkerInterceptor(JaegerUtils.getJaegerOptions(type)))
        .build();
WorkerFactory factory = WorkerFactory.newInstance(client, factoryOptions);
For more information, see the Temporal OpenTracing module.
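The JaegerUtils.getJaegerOptions(type) helper in the snippets above comes from the Temporal Java samples rather than the SDK itself. As a minimal sketch, assuming the io.jaegertracing client library is on the classpath, you could build equivalent OpenTracingOptions yourself:
import io.jaegertracing.Configuration;
import io.opentracing.Tracer;
import io.temporal.client.WorkflowClientOptions;
import io.temporal.opentracing.OpenTracingClientInterceptor;
import io.temporal.opentracing.OpenTracingOptions;

// Build a Jaeger tracer from environment variables (JAEGER_AGENT_HOST, JAEGER_SAMPLER_TYPE, and so on).
Tracer tracer = Configuration.fromEnv("temporal-sample").getTracer();

// Wrap the tracer in OpenTracingOptions and pass it to the interceptor.
OpenTracingOptions tracingOptions =
    OpenTracingOptions.newBuilder().setTracer(tracer).build();

WorkflowClientOptions clientOptions =
    WorkflowClientOptions.newBuilder()
        .setInterceptors(new OpenTracingClientInterceptor(tracingOptions))
        .build();
The same OpenTracingOptions instance can also be passed to OpenTracingWorkerInterceptor on the Worker side.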
Log from a Workflow
Logging enables you to record critical information during code execution. Loggers create an audit trail and capture information about your Workflow's operation. An appropriate logging level depends on your specific needs. During development or troubleshooting, you might use debug or even trace. In production, you might use info or warn to avoid excessive log volume.
The logger supports the following logging levels:
| Level | Use |
| --- | --- |
| TRACE | The most detailed level of logging, used for very fine-grained information. |
| DEBUG | Detailed information, typically useful for debugging purposes. |
| INFO | General information about the application's operation. |
| WARN | Indicates potentially harmful situations or minor issues that don't prevent the application from working. |
| ERROR | Indicates error conditions that might still allow the application to continue running. |
The Temporal SDK core normally uses WARN as its default logging level.
To get a standard slf4j logger in your Workflow code, use the Workflow.getLogger method.
private static final Logger logger = Workflow.getLogger(DynamicDslWorkflow.class);
Logs in replay mode are omitted unless WorkerFactoryOptions.Builder.setEnableLoggingInReplay(boolean) is set to true.
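For example, a Workflow implementation can obtain its logger through Workflow.getLogger and enable replay logging through WorkerFactoryOptions. The OrderWorkflow interface and processOrder method below are hypothetical placeholders for your own Workflow:
import io.temporal.worker.WorkerFactory;
import io.temporal.worker.WorkerFactoryOptions;
import io.temporal.workflow.Workflow;
import org.slf4j.Logger;

public class OrderWorkflowImpl implements OrderWorkflow {
  // Workflow.getLogger returns a replay-aware slf4j Logger.
  private static final Logger logger = Workflow.getLogger(OrderWorkflowImpl.class);

  @Override
  public void processOrder(String orderId) {
    logger.info("Processing order {}", orderId);
  }
}

// Elsewhere, when creating the WorkerFactory: also emit these log statements during replay.
WorkerFactoryOptions factoryOptions =
    WorkerFactoryOptions.newBuilder().setEnableLoggingInReplay(true).build();
WorkerFactory factory = WorkerFactory.newInstance(client, factoryOptions);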
How to provide a custom logger
To use a custom logger, supply your own logging implementation and configuration details the same way you would in any other Java application.
Visibility APIs
The term Visibility, within the Temporal Platform, refers to the subsystems and APIs that enable an operator to view Workflow Executions that currently exist within a Temporal Service.
How to use Search Attributes
The typical method of retrieving a Workflow Execution is by its Workflow Id.
However, sometimes you'll want to retrieve one or more Workflow Executions based on another property. For example, imagine you want to get all Workflow Executions of a certain type that have failed within a time range, so that you can start new ones with the same arguments.
You can do this with Search Attributes.
- Default Search Attributes like WorkflowType, StartTime, and ExecutionStatus are automatically added to Workflow Executions.
- Custom Search Attributes can contain their own domain-specific data (like customerId or numItems).
  - A few generic Custom Search Attributes like CustomKeywordField and CustomIntField are created by default in Temporal's Docker Compose.
The steps to use custom Search Attributes are:
- Create a new Search Attribute in your Temporal Service using temporal operator search-attribute create or the Cloud UI.
- Set the value of the Search Attribute for a Workflow Execution:
  - On the Client by including it as an option when starting the Execution.
  - In the Workflow by calling upsertTypedSearchAttributes.
- Read the value of the Search Attribute:
  - On the Client by calling DescribeWorkflow.
  - In the Workflow by looking at WorkflowInfo.
- Query Workflow Executions by the Search Attribute using a List Filter (see the sketch after this list):
  - In the Temporal CLI.
  - In code by calling ListWorkflowExecutions.
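As a hedged sketch of that last step, assuming an existing WorkflowClient named client and an SDK version that provides WorkflowClient.listExecutions, a List Filter query from code might look like the following (the isOrderFailed attribute and PizzaWorkflow type come from the example later in this section):
import io.temporal.client.WorkflowClient;
import io.temporal.client.WorkflowExecutionMetadata;
import java.util.stream.Stream;

// Query Workflow Executions with a List Filter on a custom Search Attribute.
// A roughly equivalent CLI query: temporal workflow list --query 'isOrderFailed = true'
Stream<WorkflowExecutionMetadata> failedOrders =
    client.listExecutions("WorkflowType = 'PizzaWorkflow' AND isOrderFailed = true");
failedOrders.forEach(m -> System.out.println(m.getExecution().getWorkflowId()));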
How to set custom Search Attributes
After you've created custom Search Attributes in your Temporal Service (using temporal operator search-attribute create or the Cloud UI), you can set the values of the custom Search Attributes when starting a Workflow.
When starting a Workflow Execution with your Client, include the Custom Search Attribute in the options using WorkflowOptions.newBuilder().setTypedSearchAttributes():
// In a shared constants file, so all files have access
public static final SearchAttributeKey<Boolean> IS_ORDER_FAILED = SearchAttributeKey.forBoolean("isOrderFailed");
...
// In main
WorkflowOptions options =
    WorkflowOptions.newBuilder()
        .setWorkflowId(workflowID)
        .setTaskQueue(Constants.TASK_QUEUE_NAME)
        .setTypedSearchAttributes(generateSearchAttributes())
        .build();
PizzaWorkflow workflow = client.newWorkflowStub(PizzaWorkflow.class, options);
...
// Further down in the file
private static SearchAttributes generateSearchAttributes() {
  return SearchAttributes.newBuilder().set(Constants.IS_ORDER_FAILED, false).build();
}
Each custom Search Attribute is identified by a SearchAttributeKey, which pairs the attribute name with a specific value type. Currently, the following types are supported:
- Boolean
- Double
- Long
- Keyword
- KeywordList
- Text
In this example, isOrderFailed is set as a Search Attribute. This attribute is useful for querying Workflows based on the success or failure of customer orders.
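As a brief sketch, each of these types has a corresponding SearchAttributeKey factory method. The attribute names below, other than isOrderFailed, are hypothetical examples rather than attributes defined by this sample:
import io.temporal.common.SearchAttributeKey;
import java.util.List;

// One key per supported value type; the names (except isOrderFailed) are hypothetical.
public static final SearchAttributeKey<Boolean> IS_ORDER_FAILED = SearchAttributeKey.forBoolean("isOrderFailed");
public static final SearchAttributeKey<Double> ORDER_TOTAL = SearchAttributeKey.forDouble("orderTotal");
public static final SearchAttributeKey<Long> NUM_ITEMS = SearchAttributeKey.forLong("numItems");
public static final SearchAttributeKey<String> CUSTOMER_ID = SearchAttributeKey.forKeyword("customerId");
public static final SearchAttributeKey<List<String>> ORDER_TAGS = SearchAttributeKey.forKeywordList("orderTags");
public static final SearchAttributeKey<String> DELIVERY_NOTES = SearchAttributeKey.forText("deliveryNotes");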
How to upsert Search Attributes
Within the Workflow code, you can dynamically add or update Search Attributes using upsertTypedSearchAttributes.
This method is particularly useful for Workflows whose attributes need to change based on internal logic or external events.
import io.temporal.workflow.Workflow;
...
// Existing Workflow logic
Distance distance;
try {
  distance = activities.getDistance(address);
  // The order succeeded; record that in the Search Attribute.
  Workflow.upsertTypedSearchAttributes(Constants.IS_ORDER_FAILED.valueSet(false));
} catch (NullPointerException e) {
  // The order failed; update the Search Attribute before rethrowing.
  Workflow.upsertTypedSearchAttributes(Constants.IS_ORDER_FAILED.valueSet(true));
  throw new NullPointerException("Unable to get distance");
}
How to remove a Search Attribute from a Workflow
To remove a Search Attribute that was previously set, unset it by passing the key's valueUnset() update to upsertTypedSearchAttributes.
// In a shared constants file, so all files have access
public static final SearchAttributeKey<Boolean> IS_ORDER_FAILED = SearchAttributeKey.forBoolean("isOrderFailed");
...
Workflow.upsertTypedSearchAttributes(Constants.IS_ORDER_FAILED.valueUnset());