Dataset Creation Issue with Arcadia Instant for KSQL

I did a quick test with Arcadia Instant 4.5 and the Confluent Platform Demo 5.1.0 (better known as cp-demo). Instead of showing only the streams and tables that have been defined, both the new dataset creator and the connection explorer show all of the topics on the Kafka cluster. This makes navigation difficult, especially since internal topics are not filtered out, and if a topic name doesn’t match the stream or table name I will likely get an error, because the topic name is used to fetch the metadata from KSQL.
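Judging by the `statementText` in the error I hit (shown further down), the topic name appears to be split on the dot before being sent to KSQL, so `wikipedia.parsed` becomes `DESCRIBE parsed;`. A tiny sketch that just reproduces the observed behavior (the function name is made up):

```python
def dataset_to_ksql_name(topic):
    """Reproduce what appears to happen today: the topic name is treated
    like a schema-qualified table name, so only the part after the last
    dot is used in the DESCRIBE statement sent to KSQL."""
    return topic.split(".")[-1]

# Matches the statementText "describe parsed;" in the error below.
print(f"describe {dataset_to_ksql_name('wikipedia.parsed')};")  # describe parsed;
```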

The behavior I would expect is to either show only the available streams and tables, or to have a separate topic explorer and provide a form for creating a new stream or table.

I tested with some other streams backed by topics of the same name and everything worked as expected. Seeing the visuals update in real time is very cool and puts a nice face on stream processing.

Patrick,

Thanks for the feedback and suggested improvements. If you have any snapshots of errors while exploring topics please share them (if possible) and we’ll pass them along to our engineering team.

Thank you,
Tadd Wood

@Patrick_Druley Is the new dataset creation modal not sufficient? It has a hierarchy of Topics -> Tables/Streams to make it easy to find what you are looking for.

Also, the Dataset Source gives you the option to create your own stream.

Also, there is an “ALL” option in the Connection Explorer that narrows the list down to the streams/tables. Did you see that?

Here is the error message:

{code}

Error

Failed to load dataset table column detail

{
  "@type": "statement_error",
  "error_code": 40001,
  "message": "Could not find STREAM/TABLE 'PARSED' in the Metastore",
  "stackTrace": [
    "io.confluent.ksql.rest.server.resources.KsqlResource.describe(KsqlResource.java:424)",
    "io.confluent.ksql.rest.server.resources.KsqlResource.validateStatement(KsqlResource.java:228)",
    "io.confluent.ksql.rest.server.resources.KsqlResource.handleKsqlStatements(KsqlResource.java:181)",
    "sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
    "sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
    "sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
    "java.lang.reflect.Method.invoke(Method.java:498)",
    "org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:76)",
    "org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:148)",
    "org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:191)",
    "org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:200)",
    "org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:103)",
    "org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:493)",
    "org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:415)",
    "org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:104)",
    "org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:277)",
    "org.glassfish.jersey.internal.Errors$1.call(Errors.java:272)",
    "org.glassfish.jersey.internal.Errors$1.call(Errors.java:268)",
    "org.glassfish.jersey.internal.Errors.process(Errors.java:316)",
    "org.glassfish.jersey.internal.Errors.process(Errors.java:298)",
    "org.glassfish.jersey.internal.Errors.process(Errors.java:268)",
    "org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:289)",
    "org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:256)",
    "org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:703)",
    "org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:416)",
    "org.glassfish.jersey.servlet.ServletContainer.serviceImpl(ServletContainer.java:409)",
    "org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:584)",
    "org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:525)",
    "org.glassfish.jersey.servlet.ServletContainer.doFilter(ServletContainer.java:462)",
    "org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)",
    "org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533)",
    "org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)",
    "org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595)",
    "org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)",
    "org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1340)",
    "org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)",
    "org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)",
    "org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564)",
    "org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)",
    "org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1242)",
    "org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)",
    "org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)",
    "org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:174)",
    "org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)",
    "org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:740)",
    "org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)",
    "org.eclipse.jetty.server.Server.handle(Server.java:503)",
    "org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)",
    "org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)",
    "org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)",
    "org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)",
    "org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)",
    "org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)",
    "org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)",
    "org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)",
    "org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)",
    "org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)",
    "org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)",
    "org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)",
    "java.lang.Thread.run(Thread.java:748)"
  ],
  "statementText": "describe parsed;",
  "entities": []
}
{code}

@Patrick_Druley - in the new dataset dialog, what did you choose for the Topic and Table/Stream drop-downs?

If those are correct, then I don’t see why the system wouldn’t be able to pick up the correct stream/table.

Here’s an example of creating a dataset from a table with a name different from its topic:

[screenshot: new dataset dialog with a table whose name differs from its backing topic]

@shaun

Agree, it does work if the topic name is different, as long as Arcadia Instant can parse the topic name correctly, which in the case of my topic, “wikipedia.parsed” (the one that returned the error above), it cannot. Even if you fix the topic-name parsing to handle dots (.), I think the bigger question is why it is necessary to expose topics in the first place. I could understand exposing the topics if Arcadia Data allowed topic inspection or the ability to create streams or tables from topics, but I don’t believe those capabilities currently exist. At a minimum I would highly encourage filtering out internal topics (they usually start with an underscore) by default, with a toggle switch in case the user really wants to see them (we currently do this in Control Center).
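To make the filtering idea concrete, here is a minimal sketch; the function name and the `show_internal` flag are made up, and the underscore-prefix convention is just the common Kafka naming pattern:

```python
def filter_internal_topics(topics, show_internal=False):
    """Hide Kafka internal topics (conventionally prefixed with an
    underscore, e.g. _confluent-metrics or _schemas) unless the user
    flips the toggle to opt in."""
    if show_internal:
        return list(topics)
    return [t for t in topics if not t.startswith("_")]

# Example topic list from a cp-demo-like cluster (names illustrative).
topics = ["wikipedia.parsed", "_confluent-metrics", "_schemas", "WIKIPEDIANOBOT"]
print(filter_internal_topics(topics))        # hides _confluent-metrics and _schemas
print(filter_internal_topics(topics, True))  # toggle on: shows everything
```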

It’s also worth noting that KSQL has no way to group streams or tables other than by topic, which I assume is why you reused the interface that would normally be used for a database schema grouping tables and views together. My suggestion would be to show all of the streams and tables first by default, and then, when a user selects a stream or table, show the underlying topic, the message format, and perhaps some sample data. I could see a future where KSQL adds a grouping object that would leverage this existing interface well, but currently it doesn’t make much sense to me.

@Patrick_Druley have you seen the “Direct Access” option?

That should allow you to create streams/tables from a Kafka topic, as you indicated. See example below:

Now when I go to the topic I can see the new stream I created:
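For anyone reading along without the screenshots, the statement typed into Direct Access would look roughly like the one below; this sketch builds the request the KSQL REST API expects, where the stream name, topic, and the localhost:8088 address are all assumptions:

```python
import json
from urllib.request import Request, urlopen

KSQL_URL = "http://localhost:8088/ksql"  # assumed default KSQL server address

def ksql_payload(statement):
    """Build the POST body the KSQL REST /ksql endpoint expects."""
    return {"ksql": statement, "streamsProperties": {}}

# The kind of statement Direct Access would run; names are illustrative.
# With VALUE_FORMAT='AVRO' and no column list, KSQL pulls the schema
# from the Confluent Schema Registry.
create_stream = (
    "CREATE STREAM wikipedia_stream "
    "WITH (KAFKA_TOPIC='wikipedia.parsed', VALUE_FORMAT='AVRO');"
)

body = json.dumps(ksql_payload(create_stream)).encode()
print(body.decode())

# Uncomment to run against a live KSQL server:
# req = Request(KSQL_URL, data=body,
#               headers={"Content-Type": "application/vnd.ksql.v1+json"})
# print(urlopen(req).read())
```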

@shaun Yeah, I like that Direct Access option as it provides flexibility for both custom queries and creating streams and tables, as you demonstrated. However, if you look at it from a user perspective, the information I need to create a stream or table on an existing topic isn’t really available in Arcadia yet (the value format and value fields). I would need some way to do topic inspection, or to read from the Confluent Schema Registry, to understand the messages that are currently in a topic. I tested running a KSQL PRINT command in the Direct Access editor and it didn’t work. If you can get that to work, it would be a fast path to topic inspection.
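Until PRINT works, even a crude peek at raw message bytes would help, because the value format is partly detectable: Confluent's Avro wire format starts with a 0x00 magic byte followed by a 4-byte big-endian schema id. A sketch of that heuristic (the function name is made up):

```python
import json
import struct

def guess_value_format(raw: bytes) -> str:
    """Best-effort guess at a topic's value format from one raw message.
    Confluent Avro framing is: magic byte 0x00 + 4-byte schema id +
    Avro payload; anything that parses as JSON is reported as json."""
    if len(raw) >= 5 and raw[0] == 0:
        schema_id = struct.unpack(">I", raw[1:5])[0]
        return f"avro (schema registry id {schema_id})"
    try:
        json.loads(raw)
        return "json"
    except (ValueError, UnicodeDecodeError):
        return "unknown"

print(guess_value_format(b"\x00\x00\x00\x00\x07avro-payload"))  # avro (schema registry id 7)
print(guess_value_format(b'{"user": "wikipedia-bot"}'))         # json
```

With the schema id in hand, a follow-up lookup against the Schema Registry would give the value fields as well.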

@Patrick_Druley agreed, that is a feature we should add to make the workflow easier.