Right now I am trying to integrate the MediaPipe C++ library into my Flutter app. To start, I wrote a simple C++ function containing a printf() call and tried to run it from Flutter.
I compiled it as follows:
g++ -shared -o produce_landmarks.so -fPIC produce_landmarks.cpp
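For reference, a minimal version of what I am trying to run looks roughly like this (produce_landmarks.cpp; as far as I understand, the extern "C" is needed so Dart's FFI can look the symbol up by name, without C++ name mangling):

```cpp
#include <cstdio>

// extern "C" disables C++ name mangling, so the symbol keeps the
// plain name "produce_landmarks" that DynamicLibrary.lookup expects.
extern "C" void produce_landmarks() {
    printf("hello from C++\n");
}
```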
I put the compiled shared .so file into each of the following folders under android/app/src/main/jniLibs:
- /arm64-v8a
- /armeabi-v7a
- /x86
- /x86_64
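From what I have read, the cross-compilation for those four ABIs would look roughly like the sketch below, using the Android NDK's per-target clang++ wrappers. The $NDK path, the API level 21, and the linux-x86_64 prebuilt directory (it would be windows-x86_64 on a Windows host) are all assumptions about my setup, not something I have verified:

```shell
# Sketch: build one .so per ABI with the NDK's target-prefixed clang++.
# $NDK is the Android NDK root; 21 is an assumed minSdkVersion.
TOOLCHAIN=$NDK/toolchains/llvm/prebuilt/linux-x86_64/bin

$TOOLCHAIN/aarch64-linux-android21-clang++ -shared -fPIC \
    -o jniLibs/arm64-v8a/produce_landmarks.so produce_landmarks.cpp
$TOOLCHAIN/armv7a-linux-androideabi21-clang++ -shared -fPIC \
    -o jniLibs/armeabi-v7a/produce_landmarks.so produce_landmarks.cpp
$TOOLCHAIN/i686-linux-android21-clang++ -shared -fPIC \
    -o jniLibs/x86/produce_landmarks.so produce_landmarks.cpp
$TOOLCHAIN/x86_64-linux-android21-clang++ -shared -fPIC \
    -o jniLibs/x86_64/produce_landmarks.so produce_landmarks.cpp
```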
Unfortunately I got a bad ELF error. As far as I can tell, the magic bytes 4d5a9000 are the "MZ" header of a Windows executable, so my host g++ apparently produced a Windows binary rather than an Android ELF for the target architecture.

"data/app/.../lib/arm64/produce_landmarks.so" has bad ELF magic: 4d5a9000 // on phone
"/data/app/.../lib/x86_64/produce_landmarks.so" has bad ELF magic: 4d5a9000 // on emulator
But do I really need to compile the C++ code, which right now is just a simple function with a printf() call, separately for each architecture? If so, how do I do that?
I would like to know the exact steps to run, in Flutter:
- a simple C++ function
- the MediaPipe C++ library
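In case it clarifies what I am aiming for with MediaPipe: my understanding is that the C++ side has to expose a plain C ABI for Dart FFI, something like the skeleton below. All the names here are made up by me, and the actual MediaPipe calls are stubbed out with comments:

```cpp
#include <cstdlib>

// Hypothetical C-ABI wrapper around a C++ pipeline so Dart FFI can
// call it. Opaque pointer + create/process/destroy is a common pattern.
struct LandmarkEngine {
    // Real code would hold the MediaPipe graph state here.
    int frames_processed = 0;
};

extern "C" LandmarkEngine* engine_create() {
    // Real code: build and start the MediaPipe graph.
    return new LandmarkEngine();
}

extern "C" int engine_process_frame(LandmarkEngine* e,
                                    const unsigned char* rgba,
                                    int width, int height) {
    // Real code: wrap the pixels in a frame, push it into the graph,
    // and poll the landmark output stream.
    (void)rgba; (void)width; (void)height;
    return ++e->frames_processed;  // stub: just count frames
}

extern "C" void engine_destroy(LandmarkEngine* e) {
    delete e;
}
```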
I am asking because I am confused by the many different ways the ffi package is used in YouTube tutorials. Do I need to use ffigen? I have seen many people create plugins, but I don't think I need that, because I don't want to access native platform features like the camera and so on.
I would really appreciate it if someone could help me.