Running a Slack service and token streaming concurrently in Spring WebFlux
James K J
Posted on April 29, 2024
I have a function named sendSlackMessage which takes a string and sends a message to Slack. Now I'm faced with a scenario where I need to process a large text, split it into batches of 10 words, and send each batch to Slack using sendSlackMessage. However, since calling sendSlackMessage is time-consuming, I'm exploring the possibility of running the Slack service and the token streaming concurrently to improve performance.
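To make the batching concrete, here is a minimal, self-contained sketch of splitting a text into 10-word batches. The class name WordBatcher and its batch method are illustrative only, not part of my actual code:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class WordBatcher {
    // Splits `text` on whitespace and groups the words into batches of
    // `batchSize` words each; the final batch may be shorter.
    public static List<String> batch(String text, int batchSize) {
        String[] words = text.trim().split("\\s+");
        List<String> batches = new ArrayList<>();
        for (int i = 0; i < words.length; i += batchSize) {
            int end = Math.min(i + batchSize, words.length);
            batches.add(String.join(" ", Arrays.copyOfRange(words, i, end)));
        }
        return batches;
    }
}
```

Each resulting batch would then be handed to sendSlackMessage.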
Here's a simplified version of my code:
```java
private Flux<StreamData> processStreamDataFlux(Flux<Object> streamDataFlux,
                                               AtomicReference<StringBuilder> answer,
                                               ChatBotConversationRPCResponse response) {
    AtomicBoolean hasAnswer = new AtomicBoolean(true);
    return streamDataFlux
            .flatMap(data -> {
                var inputData = ((StreamData) data);
                return Mono.just(inputData);
            })
            .doOnNext(streamData -> {
                if (!streamData.getToken().isEmpty()) {
                    if (streamData.getIsAnswer().equals("True")) {
                        answer.updateAndGet(sb -> {
                            sb.append(streamData.getToken());
                            if (sb.length() >= 10) {
                                sendSlackMessage(sb.toString());
                                sb.delete(0, 10);
                            }
                            return sb;
                        });
                    } else {
                        logger.info(String.format("Sources URLs: %s", streamData.getToken()));
                        response.setUrlSources(extractLinks(streamData.getToken()));
                    }
                }
            })
            .doOnComplete(() -> {
                if (hasAnswer.get()) {
                    response.setAnswer(answer.get().toString());
                    response.setResponse(true);
                } else {
                    response.setAnswer(chatBot.getNoAnswer());
                    response.setResponse(false);
                }
            });
}
```
In this code, processStreamDataFlux processes the incoming stream data and calls sendSlackMessage for each batch. However, I'm concerned that the blocking sendSlackMessage call will stall the token streaming pipeline. Is there a way to run the Slack service on a separate thread while keeping the token streaming smooth?
I'd appreciate any insights or suggestions on how to achieve this concurrency in Java. Thank you!
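For reference, one direction I'm considering is offloading the blocking call to a dedicated executor. The sketch below uses only the JDK (CompletableFuture on a single-threaded executor, which also preserves message order); in Reactor the analogous move would presumably be wrapping the call in Mono.fromRunnable(...).subscribeOn(Schedulers.boundedElastic()). The class name and the stand-in sendSlackMessage below are illustrative:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class SlackOffloadSketch {
    // A single-threaded executor keeps Slack messages in order while
    // leaving the streaming thread free to keep consuming tokens.
    private static final ExecutorService slackExecutor = Executors.newSingleThreadExecutor();

    // Counter used here only to observe that the call actually ran.
    static final AtomicInteger sent = new AtomicInteger();

    // Stand-in for the real, time-consuming sendSlackMessage.
    static void sendSlackMessage(String msg) {
        sent.incrementAndGet();
    }

    // Fire-and-track: the caller gets a future it can ignore or await,
    // and the blocking work happens on the dedicated executor thread.
    static CompletableFuture<Void> sendAsync(String msg) {
        return CompletableFuture.runAsync(() -> sendSlackMessage(msg), slackExecutor);
    }
}
```

Inside doOnNext, the batch would then be submitted with sendAsync(sb.toString()) instead of calling sendSlackMessage directly.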