Exception in thread "main" org.apache.flink.table.api.NoMatchingTableFactoryException

Today, while using the Flink Table API to read a source from Kafka, the job failed with the following error:

Exception in thread "main" org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.DeserializationSchemaFactory' in the classpath
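For context, the job looked roughly like the sketch below. It is a minimal example, not the original code: the Kafka version, topic name, broker address, and schema fields are placeholders, and the Kafka connector dependency (flink-connector-kafka) also has to be on the classpath. The .withFormat(new Csv()) call is what triggers the factory lookup that fails when flink-csv is missing.

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.descriptors.Csv;
import org.apache.flink.table.descriptors.Kafka;
import org.apache.flink.table.descriptors.Schema;
import org.apache.flink.types.Row;

public class KafkaTableSourceJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Register a Kafka-backed table. The CSV format below needs a
        // DeserializationSchemaFactory, which is provided by the flink-csv module.
        tableEnv.connect(new Kafka()
                .version("universal")
                .topic("sensor")                                  // placeholder topic
                .property("bootstrap.servers", "localhost:9092")) // placeholder broker
            .withFormat(new Csv())   // fails with NoMatchingTableFactoryException
                                     // when flink-csv is not on the classpath
            .withSchema(new Schema()
                .field("id", DataTypes.STRING())                  // placeholder fields
                .field("ts", DataTypes.BIGINT())
                .field("temperature", DataTypes.DOUBLE()))
            .createTemporaryTable("kafkaInputTable");

        Table result = tableEnv.from("kafkaInputTable").select("id, temperature");
        tableEnv.toAppendStream(result, Row.class).print();

        env.execute("kafka table source");
    }
}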

After looking into it, I learned that local (filesystem) sources can still use the old OldCsv format descriptor, but because OldCsv is non-standard, it is not suitable for exchanging data with external systems such as Kafka, so it is being deprecated. The new descriptor is called Csv(), and Flink does not provide it out of the box; you need to add the flink-csv dependency manually:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-csv</artifactId>
    <version>1.10.0</version>
</dependency>

Set the version in Maven to match your own Flink version. After the dependency is added, everything works normally.
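For comparison, here is roughly what the two descriptors mentioned above look like side by side. This is only an illustrative sketch; the delimiter values are defaults, not something the fix requires. OldCsv is Flink's own legacy, non-standard CSV handling, while Csv() is the standard-compliant format backed by the flink-csv module and is the one to use with external connectors such as Kafka.

import org.apache.flink.table.descriptors.Csv;
import org.apache.flink.table.descriptors.FormatDescriptor;
import org.apache.flink.table.descriptors.OldCsv;

public class CsvFormatChoice {
    // Legacy descriptor: Flink's own non-standard CSV handling, kept mainly
    // for backwards compatibility with built-in/filesystem sources.
    static FormatDescriptor legacy() {
        return new OldCsv().fieldDelimiter(",");   // illustrative delimiter
    }

    // New descriptor: standard CSV provided by the flink-csv module;
    // use this when connecting to external systems such as Kafka.
    static FormatDescriptor standard() {
        return new Csv().fieldDelimiter(',');      // illustrative delimiter
    }
}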



Origin: blog.csdn.net/weixin_44080445/article/details/113250903