[Bug] does fluss support kerberos authentication now? #328

@daihw

Description

Search before asking

  • I searched in the issues and found nothing similar.

Fluss version

Fluss 0.5, Flink 1.20, Hadoop 3

Minimal reproduce step

In server.yaml, set:

remote.data.dir: hdfs://nameservice1/user/hive/warehouse/

When I use a Hadoop cluster with Kerberos authentication as remote storage for Fluss, the following error is reported.

What doesn't meet your expectations?

2025-01-17 16:02:24,850 ERROR com.alibaba.fluss.server.kv.snapshot.PeriodicSnapshotManager [] - Fail to init snapshot during triggering snapshot.
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_192]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_192]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_192]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_192]
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121) ~[?:?]
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2509) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2483) ~[?:?]
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1485) ~[?:?]
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1482) ~[?:?]
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[?:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1499) ~[?:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1474) ~[?:?]
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2388) ~[?:?]
at com.alibaba.fluss.fs.hdfs.HadoopFileSystem.mkdirs(HadoopFileSystem.java:118) ~[?:?]
at com.alibaba.fluss.fs.PluginFileSystemWrapper$ClassLoaderFixingFileSystem.mkdirs(PluginFileSystemWrapper.java:122) ~[fluss-server-0.5.0.jar:0.5.0]
at com.alibaba.fluss.server.kv.snapshot.KvTabletSnapshotTarget.initSnapshotLocation(KvTabletSnapshotTarget.java:187) ~[fluss-server-0.5.0.jar:0.5.0]
at com.alibaba.fluss.server.kv.snapshot.KvTabletSnapshotTarget.initSnapshot(KvTabletSnapshotTarget.java:166) ~[fluss-server-0.5.0.jar:0.5.0]
at com.alibaba.fluss.server.kv.snapshot.PeriodicSnapshotManager.lambda$triggerSnapshot$1(PeriodicSnapshotManager.java:182) ~[fluss-server-0.5.0.jar:0.5.0]
at com.alibaba.fluss.utils.concurrent.LockUtils.inLock(LockUtils.java:31) ~[fluss-server-0.5.0.jar:0.5.0]
at com.alibaba.fluss.utils.concurrent.LockUtils.inWriteLock(LockUtils.java:59) ~[fluss-server-0.5.0.jar:0.5.0]
at com.alibaba.fluss.server.kv.KvTablet.lambda$getGuardedExecutor$2(KvTablet.java:452) ~[fluss-server-0.5.0.jar:0.5.0]
at com.alibaba.fluss.server.kv.snapshot.PeriodicSnapshotManager.triggerSnapshot(PeriodicSnapshotManager.java:175) ~[fluss-server-0.5.0.jar:0.5.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_192]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_192]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_192]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_192]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_192]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_192]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_192]
Caused by: org.apache.hadoop.ipc.RemoteException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1612) ~[?:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1558) ~[?:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1455) ~[?:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242) ~[?:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:129) ~[?:?]
at com.sun.proxy.$Proxy26.mkdirs(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:674) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_192]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_192]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_192]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_192]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[?:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[?:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[?:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[?:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[?:?]
at com.sun.proxy.$Proxy27.mkdirs(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2507) ~[?:?]
... 23 more

Anything else?

Did I miss any configurations?
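For context, the stack trace shows the Fluss server's embedded HDFS client performing a plain (SIMPLE) handshake, which the Kerberos-secured NameNode rejects. In the Flink runtime itself, keytab-based Kerberos login is configured with the standard options below; whether Fluss 0.5 reads an equivalent setting (or only the Hadoop configuration on its classpath) is exactly what this issue is asking. The keytab path and principal here are placeholders, not values from this cluster:

```yaml
# Flink's standard Kerberos login options (flink-conf.yaml), shown for
# comparison only -- it is unclear whether Fluss 0.5 honors anything similar.
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /path/to/fluss.keytab        # hypothetical path
security.kerberos.login.principal: fluss/host@EXAMPLE.COM    # hypothetical principal
```

Independently of Fluss-specific options, the Hadoop client side also needs core-site.xml with hadoop.security.authentication set to kerberos on the server's classpath; with SIMPLE as the effective authentication, the NameNode fails the mkdirs call exactly as in the trace above.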

Are you willing to submit a PR?

  • I'm willing to submit a PR!

Labels

question (Further information is requested)