I have a MapR cluster running Java 17, but Spark keeps failing with the error below. Can someone please let me know how to solve this?
java.lang.reflect.InaccessibleObjectException: Unable to make protected final java.lang.Class java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain) throws java.lang.ClassFormatError accessible: module java.base does not "opens java.lang" to unnamed module @29c62025
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
    at java.base/java.lang.reflect.Method.checkCanSetAccessible(Method.java:199)
I had the same issue with OpenTSDB on OpenJDK 17; it works with OpenJDK 11. As a workaround, I set JAVA_HOME manually to point at OpenJDK 11.
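If downgrading the JDK is not an option, another common workaround for this class of error is to open the JDK module named in the exception to unnamed modules via JPMS `--add-opens` flags, passed to Spark through its driver and executor Java options. This is a sketch, not a verified fix for this specific MapR setup; the application class and jar name below are hypothetical placeholders, and the module list should match whatever modules your own stack trace complains about (here `java.base/java.lang`, per the error above). Newer Spark releases (3.3+) add these flags themselves, which is why the error tends to appear with older Spark on Java 17.

```shell
# Sketch: pass JPMS --add-opens flags so Spark's reflective access to
# java.lang.ClassLoader works on Java 17. Adjust the module list to the
# modules named in your InaccessibleObjectException.
spark-submit \
  --conf "spark.driver.extraJavaOptions=--add-opens=java.base/java.lang=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-opens=java.base/java.lang=ALL-UNNAMED" \
  --class com.example.MyApp \
  my-app.jar
# com.example.MyApp and my-app.jar are hypothetical; substitute your own.
```

The same flags can also be set once in `spark-defaults.conf` instead of on every `spark-submit` invocation.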