
scala - Error when trying to write to hdfs: Server IPC version 9 cannot communicate with client version 4

I am trying to write a file to HDFS using Scala, and I keep getting the following error:

Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
at org.apache.hadoop.ipc.Client.call(Client.java:1113)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
at bcomposes.twitter.Util$.<init>(TwitterStream.scala:39)
at bcomposes.twitter.Util$.<clinit>(TwitterStream.scala)
at bcomposes.twitter.StatusStreamer$.main(TwitterStream.scala:17)
at bcomposes.twitter.StatusStreamer.main(TwitterStream.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)

I installed Hadoop following this tutorial. The code below is what I use to write a sample file into HDFS.

import java.net.URI
import java.io.{BufferedWriter, OutputStreamWriter}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Connect to the namenode and (re)create the target file
val configuration = new Configuration()
val hdfs = FileSystem.get(new URI("hdfs://192.168.11.153:54310"), configuration)
val file = new Path("hdfs://192.168.11.153:54310/s2013/batch/table.html")
if (hdfs.exists(file)) { hdfs.delete(file, true) }
val os = hdfs.create(file)
val br = new BufferedWriter(new OutputStreamWriter(os, "UTF-8"))
br.write("Hello World")
br.close()
hdfs.close()

The Hadoop version is 2.4.0, and the hadoop library version I use is 1.2.1. What should I change to make this work?


1 Answer


I had the same problem using Hadoop 2.3, and I solved it by adding the following lines to my build.sbt file:

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.3.0"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.3.0"

So I think in your case you can use the 2.4.0 version, as sketched below.
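For Hadoop 2.4.0, the matching build.sbt entries would look like the lines below (a minimal sketch assuming the same org.apache.hadoop artifacts shown above; adjust the version string to match your cluster exactly):

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.4.0"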

PS: It also worked with your code sample. I hope it helps.

