
How to read csv output by OpenFaceOffline? #25

Open
l-h-e opened this issue Jul 29, 2020 · 13 comments
@l-h-e commented Jul 29, 2020

OpenFaceOffline can output a csv as a result of face recognition.
I can manipulate an avatar's face in Unity3D by reading a csv with main.py.
However, the csv output by OpenFaceOffline has too many columns for main.py to use.

  1. Can I make main.py read the csv output by OpenFaceOffline without reformatting it, by changing the code of main.py?
  2. Are the csv output by OpenFaceOffline and the csv that main.py can read the same? (For example, demo.csv has only three gaze parameters, three pose parameters, and AU01_r ~ AU45_r, but the csv output by OpenFaceOffline has more parameters.)
@NumesSanguis (Owner)

This seems to be related to this issue: #23
Please check that issue for updates.

@l-h-e (Author) commented Jul 30, 2020

Thank you for your reply.
What about question 2?

@NumesSanguis (Owner)

If I remember correctly, FACSvatar does not use the other parameters. Therefore, the non-relevant columns have been removed from demo.csv to reduce file size.
If issue 1 is solved, main.py will simply ignore the extra columns output by OpenFaceOffline.
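
As an editorial aside (not part of the original reply), here is a minimal sketch of how a full OpenFaceOffline .csv could be trimmed down to a demo.csv-style subset of columns; the exact column names kept here (frame, timestamp, confidence, gaze, head rotation, AU intensities) are assumptions based on the description of demo.csv above.

```python
# Minimal sketch: keep only demo.csv-style columns from a full OpenFace .csv.
# Assumes pandas and OpenFace's standard column names ("frame", "timestamp",
# "confidence", "gaze_angle_*", "pose_R*", "AU*_r"); adjust the selection to
# whatever main.py actually expects.
import pandas as pd

df = pd.read_csv("your_recording.csv")
df.columns = df.columns.str.strip()  # OpenFace headers often contain leading spaces

keep = (
    ["frame", "timestamp", "confidence"]
    + [c for c in df.columns if c.startswith("gaze_angle_")]
    + [c for c in df.columns if c.startswith("pose_R")]
    + [c for c in df.columns if c.startswith("AU") and c.endswith("_r")]
)
df[keep].to_csv("your_recording_trimmed.csv", index=False)
```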

@l-h-e (Author) commented Jul 31, 2020

Thank you.
I checked whether they are the same by printing the received message in Unity3D.
However, they are slightly different.
OpenFaceOffline's message includes "pose_Tx, Ty, Tz" and "AU01~46", but the csv message does not.
I think the avatar's movement should be the same for the same video, but it moves a bit differently (in particular, the mouth movement is different).
Is this expected behavior, or a problem with my settings?

@NumesSanguis (Owner) commented Aug 1, 2020

About issue 1, I think you've put your OpenFace .csv into FACSvatar\modules\input_facsfromcsv\openface\default_clean\your_recording.csv, right? A .csv produced by OpenFace should not be put in a folder whose name ends in _clean.

I'm rewriting the documentation for FACSvatar v0.4.0, and I hope these instructions will make it a bit clearer how to use your own files: https://facsvatar.readthedocs.io/en/v0.4.0/trackers/openface.html#use-your-own-videos

> OpenFaceOffline's message includes "pose_Tx, Ty, Tz"

That is correct; FACSvatar only uses "pose_Rx, Ry, Rz".

and "AU01~46" , but csv message has not .

Can you describe in more detail what you mean? FACSvatar uses the same AU values as OpenFace.

> I think the avatar's movement should be the same for the same video, but it moves a bit differently (in particular, the mouth movement is different).

OpenFace's facial expression tracking is not perfect, so some differences are to be expected.
It would be nice to use a better FACS tracker; however, I'm not aware of an open FACS tracker that performs better than OpenFace.

@l-h-e (Author) commented Aug 2, 2020

Thank you for your reply.

> About issue 1, I think you've put your OpenFace .csv into FACSvatar\modules\input_facsfromcsv\openface\default_clean\your_recording.csv, right? A .csv produced by OpenFace should not be put in a folder whose name ends in _clean.

I see.
However, the result did not change.

> Can you describe in more detail what you mean? FACSvatar uses the same AU values as OpenFace.

This is the output in Unity3D with Debug.Log(facsvatar); when I use OpenFaceOffline:

{
  "timestamp": 8.072677,
  "frame": 198,
  "confidence": 0.875,
  "pose": {
    "pose_Tx": 13.690295647423422,
    "pose_Ty": 114.7141430875774,
    "pose_Tz": 294.95225881900109,
    "pose_Rx": -0.12924998189751064,
    "pose_Ry": 0.188265749507212,
    "pose_Rz": -0.14415760728584548
  },
  "au_c": {
    "AU04": 1.0,
    "AU05": 0.0,
    "AU06": 0.0,
    "AU07": 1.0,
    "AU10": 0.0,
    "AU12": 0.0,
    "AU14": 0.0,
    "AU23": 0.0,
    "AU01": 1.0,
    "AU02": 0.0,
    "AU09": 0.0,
    "AU15": 0.0,
    "AU17": 0.0,
    "AU20": 0.0,
    "AU25": 1.0,
    "AU26": 1.0,
    "AU28": 0.0,
    "AU45": 1.0
  },
  "blendshapes": {
    "Expressions_abdomExpansion_max": 0.0,
    "Expressions_abdomExpansion_min": 0.0,
    "Expressions_browOutVertL_max": 0.21634,
    "Expressions_browOutVertL_min": 0.32781000000000005,
    "Expressions_browOutVertR_max": 0.21634,
    "Expressions_browOutVertR_min": 0.32781000000000005,
    "Expressions_browSqueezeL_max": 0.52522,
    "Expressions_browSqueezeL_min": 0.24038,
    "Expressions_browSqueezeR_max": 0.52522,
    "Expressions_browSqueezeR_min": 0.24038,
    "Expressions_browsMidVert_max": 0.4538,
    "Expressions_browsMidVert_min": 0.52522,
    "Expressions_cheekSneerL_max": 0.02729,
    "Expressions_cheekSneerR_max": 0.02729,
    "Expressions_chestExpansion_max": 0.0,
    "Expressions_chestExpansion_min": 0.0,
    "Expressions_eyeClosedL_max": 0.29848,
    "Expressions_deglutition_max": 0.0,
    "Expressions_deglutition_min": 0.0,
    "Expressions_eyeClosedL_min": 0.32605,
    "Expressions_eyeClosedPressureL_max": 0.00102,
    "Expressions_eyeClosedPressureL_min": 0.13042,
    "Expressions_eyeClosedPressureR_max": 0.00102,
    "Expressions_eyeClosedPressureR_min": 0.13042,
    "Expressions_eyeClosedR_max": 0.29848,
    "Expressions_eyeClosedR_min": 0.32605,
    "Expressions_eyeSquintL_max": 0.05217,
    "Expressions_eyeSquintL_min": 0.0,
    "Expressions_eyeSquintR_max": 0.05217,
    "Expressions_eyeSquintR_min": 0.0,
    "Expressions_eyesHoriz_max": 0.0,
    "Expressions_eyesHoriz_min": 0.15405,
    "Expressions_eyesVert_max": 0.02615,
    "Expressions_eyesVert_min": 0.0,
    "Expressions_jawHoriz_max": 0.0,
    "Expressions_jawHoriz_min": 0.0,
    "Expressions_jawOut_max": 0.0,
    "Expressions_jawOut_min": 0.0,
    "Expressions_mouthBite_max": 0.0,
    "Expressions_mouthBite_min": 0.0,
    "Expressions_mouthChew_max": 0.05557,
    "Expressions_mouthChew_min": 0.0,
    "Expressions_mouthClosed_max": 0.12968,
    "Expressions_mouthClosed_min": 0.54759,
    "Expressions_mouthHoriz_max": 0.0,
    "Expressions_mouthHoriz_min": 0.0,
    "Expressions_mouthInflated_max": 0.0,
    "Expressions_mouthInflated_min": 0.0,
    "Expressions_mouthLowerOut_max": 0.0,
    "Expressions_mouthLowerOut_min": 0.0,
    "Expressions_mouthOpenAggr_max": 0.00102,
    "Expressions_mouthOpenAggr_min": 0.00317,
    "Expressions_mouthOpenHalf_max": 0.0,
    "Expressions_mouthOpenLarge_max": 0.1945,
    "Expressions_mouthOpenLarge_min": 0.0,
    "Expressions_mouthOpenO_max": 0.0,
    "Expressions_mouthOpenO_min": 0.0,
    "Expressions_mouthOpenTeethClosed_max": 0.05476,
    "Expressions_mouthOpenTeethClosed_min": 0.0,
    "Expressions_mouthOpen_max": 0.0,
    "Expressions_mouthOpen_min": 0.0,
    "Expressions_mouthSmileL_max": 0.17632,
    "Expressions_mouthSmileOpen2_max": 0.0,
    "Expressions_mouthSmileOpen2_min": 0.0,
    "Expressions_mouthSmileOpen_max": 0.00563,
    "Expressions_mouthSmileOpen_min": 0.0,
    "Expressions_mouthSmileR_max": 0.17632,
    "Expressions_mouthSmile_max": 0.26172,
    "Expressions_mouthSmile_min": 0.0,
    "Expressions_nostrilsExpansion_max": 0.00732,
    "Expressions_nostrilsExpansion_min": 0.0,
    "Expressions_pupilsDilatation_max": 0.0,
    "Expressions_pupilsDilatation_min": 0.0,
    "Expressions_tongueHoriz_max": 0.0,
    "Expressions_tongueHoriz_min": 0.0,
    "Expressions_tongueOutPressure_max": 0.0,
    "Expressions_tongueOut_max": 0.0,
    "Expressions_tongueOut_min": 0.0,
    "Expressions_tongueTipUp_max": 0.0,
    "Expressions_tongueVert_max": 0.0,
    "Expressions_tongueVert_min": 0.0
  }
}

However, the "au_c" item is not included when I use the .csv.
This is the output in Unity3D when I use "python main.py":

{
  "confidence": 0.98,
  "frame": 132,
  "timestamp": 4.4289999999999994,
  "pose": {
    "pose_Rx": -0.14850085089567011,
    "pose_Ry": 0.42489385359125481,
    "pose_Rz": -0.068709993021343674
  },
  "timestamp_utc": 15963537882826652,
  "blendshapes": {
    "Expressions_abdomExpansion_max": 0.0,
    "Expressions_abdomExpansion_min": 0.0,
    "Expressions_browOutVertL_max": 0.04829,
    "Expressions_browOutVertL_min": 1.70024,
    "Expressions_browOutVertR_max": 0.04829,
    "Expressions_browOutVertR_min": 1.70024,
    "Expressions_browSqueezeL_max": 2.78838,
    "Expressions_browSqueezeL_min": 0.05366,
    "Expressions_browSqueezeR_max": 2.78838,
    "Expressions_browSqueezeR_min": 0.05366,
    "Expressions_browsMidVert_max": 0.00543,
    "Expressions_browsMidVert_min": 2.78838,
    "Expressions_cheekSneerL_max": 1.92327,
    "Expressions_cheekSneerR_max": 1.92327,
    "Expressions_chestExpansion_max": 0.0,
    "Expressions_chestExpansion_min": 0.0,
    "Expressions_eyeClosedL_max": 0.02071,
    "Expressions_deglutition_max": 0.0,
    "Expressions_deglutition_min": 0.0,
    "Expressions_eyeClosedL_min": 0.00835,
    "Expressions_eyeClosedPressureL_max": 0.0067,
    "Expressions_eyeClosedPressureL_min": 0.00334,
    "Expressions_eyeClosedPressureR_max": 0.0067,
    "Expressions_eyeClosedPressureR_min": 0.00334,
    "Expressions_eyeClosedR_max": 0.02071,
    "Expressions_eyeClosedR_min": 0.00835,
    "Expressions_eyeSquintL_max": 0.00134,
    "Expressions_eyeSquintL_min": 0.0,
    "Expressions_eyeSquintR_max": 0.00134,
    "Expressions_eyeSquintR_min": 0.0,
    "Expressions_eyesHoriz_max": 0.0,
    "Expressions_eyesHoriz_min": 0.294,
    "Expressions_eyesVert_max": 0.0,
    "Expressions_eyesVert_min": 0.053,
    "Expressions_jawHoriz_max": 0.0,
    "Expressions_jawHoriz_min": 0.0,
    "Expressions_jawOut_max": 0.0,
    "Expressions_jawOut_min": 0.0,
    "Expressions_mouthBite_max": 0.0,
    "Expressions_mouthBite_min": 0.0,
    "Expressions_mouthChew_max": 0.03772,
    "Expressions_mouthChew_min": 0.0,
    "Expressions_mouthClosed_max": 0.05882,
    "Expressions_mouthClosed_min": 0.17891,
    "Expressions_mouthHoriz_max": 0.0,
    "Expressions_mouthHoriz_min": 0.0,
    "Expressions_mouthInflated_max": 0.0,
    "Expressions_mouthInflated_min": 0.0,
    "Expressions_mouthLowerOut_max": 0.0,
    "Expressions_mouthLowerOut_min": 0.0174,
    "Expressions_mouthOpenAggr_max": 0.0067,
    "Expressions_mouthOpenAggr_min": 1.1121,
    "Expressions_mouthOpenHalf_max": 0.0,
    "Expressions_mouthOpenLarge_max": 0.13202,
    "Expressions_mouthOpenLarge_min": 0.0,
    "Expressions_mouthOpenO_max": 0.0,
    "Expressions_mouthOpenO_min": 0.0,
    "Expressions_mouthOpenTeethClosed_max": 0.01789,
    "Expressions_mouthOpenTeethClosed_min": 0.0,
    "Expressions_mouthOpen_max": 0.0,
    "Expressions_mouthOpen_min": 0.0,
    "Expressions_mouthSmileL_max": 0.03529,
    "Expressions_mouthSmileOpen2_max": 0.0,
    "Expressions_mouthSmileOpen2_min": 0.05219,
    "Expressions_mouthSmileOpen_max": 0.0,
    "Expressions_mouthSmileOpen_min": 0.0,
    "Expressions_mouthSmileR_max": 0.03529,
    "Expressions_mouthSmile_max": 0.02721,
    "Expressions_mouthSmile_min": 0.23287,
    "Expressions_nostrilsExpansion_max": 1.8669,
    "Expressions_nostrilsExpansion_min": 0.0,
    "Expressions_pupilsDilatation_max": 0.0,
    "Expressions_pupilsDilatation_min": 0.0,
    "Expressions_tongueHoriz_max": 0.0,
    "Expressions_tongueHoriz_min": 0.0,
    "Expressions_tongueOutPressure_max": 0.0,
    "Expressions_tongueOut_max": 0.0,
    "Expressions_tongueOut_min": 0.0,
    "Expressions_tongueTipUp_max": 0.0,
    "Expressions_tongueVert_max": 0.0,
    "Expressions_tongueVert_min": 0.0
  }
}

They are a bit different.

@NumesSanguis (Owner)

Thank you for the detailed log!

  • "au_c" means classification if that AU is active or not. Basically it means that if au_r >= 0.5 then it's 1, else 0. This value is not used in FACSvatar.
  • "pose_Tx" is also ignored.

There is a slight difference between using OpenFaceOffline for real-time purposes and using the .csv. In general, the .csv is more accurate and therefore preferred when real-time is not necessary. Apparently, OpenFace does some post-processing over the whole file when it writes the results to a .csv; this does not happen when it is used in real time.

@l-h-e (Author) commented Aug 4, 2020

Thank you for your reply.
I see.
I have another question: how do I use 2 models with csv input?
With the default settings, I receive the following message:

Received message: ['openface.p0.somefile, 15965343864475856, {'confidence': 0.98, 'frame': 22, 'timestamp': 0.738, 'au_r': {'AU01': 0.09726842766153403, 'AU02': 0.034993200875877276, 'AU04': 3.0209638855816707, 'AU05': 0.006224593312018546, 'AU06': 0.0, 'AU07': 2.0982603297049134, 'AU09': 0.027994560700701818, 'AU10': 1.5865665697855156, 'AU12': 0.0, 'AU14': 0.0, 'AU15': 0.04173519116644401, 'AU17': 0.23696478524592737, 'AU20': 0.03250339956206137, 'AU23': 0.03250339956206137, 'AU25': 0.1875066408505329, 'AU26': 0.18623398961520926, 'AU45': 0.0, 'AU61': 0.268, 'AU62': 0, 'AU63': 0, 'AU64': 0.017}, 'pose': {'pose_Rx': -0.15426555630457997, 'pose_Ry': 0.4306318043153648, 'pose_Rz': -0.05582972387846044}, 'timestamp_utc': 15965343864141326}]

I want to change "p0" to "p1".
What should I do?
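
As background (an editorial note, not from the original thread): FACSvatar passes these messages over ZeroMQ pub/sub, and the first element in the log above ('openface.p0.somefile') is the topic. A subscriber only receives topics matching the prefix it subscribed to, which is why the p0/p1 part matters. A minimal sketch, assuming pyzmq and the three-part message layout visible in the log (the address and port are placeholders):

```python
# Minimal sketch (not FACSvatar's actual module code): with ZeroMQ pub/sub,
# a SUB socket only receives messages whose topic starts with its subscription
# prefix, so data published as "openface.p0...." never reaches a subscriber
# that subscribed to "openface.p1".
import zmq

ctx = zmq.Context.instance()
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:5571")                   # placeholder address/port
sub.setsockopt_string(zmq.SUBSCRIBE, "openface.p1")   # 2nd avatar listens to p1

topic, timestamp, payload = sub.recv_multipart()      # assumes 3-part messages as in the log
print(topic.decode(), payload.decode()[:80])
```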

@NumesSanguis (Owner) commented Aug 5, 2020

Do you want 2 models with the same facial expressions (1 .csv), or 2 models with different facial expressions (2 .csv)?

In the case of 2 .csv files, you need to follow this naming scheme:

Let me give you some more detailed instructions later.

@l-h-e (Author) commented Aug 5, 2020

Thank you for your reply.
I used the following command:

python main.py --csv_arg 2people_60fps_p1.csv --pub_ip facsvatar_bridge

My UnityMainThreadDispatcher's Participants setting is "Users_2_models_2".
However, only the woman avatar moves.
Docker received the message "Received message: ['openface.p0.2people_60fps_p1' ...".
What should I do?

@NumesSanguis (Owner)

I'll get back to you, but I believe changing p1 to p* does the trick. So it would be:
python main.py --csv_arg 2people_60fps_p*.csv --pub_ip facsvatar_bridge

Then it sends the facial data of p0.csv to the 1st avatar, and p1.csv to the 2nd avatar.
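
As an aside (an editorial sketch, not FACSvatar's actual implementation): the p* wildcard can be thought of as expanding to every matching file, with the _pN suffix of each filename deciding which participant topic, and therefore which avatar, its data goes to. A minimal sketch of that idea, where the filename pattern and regex are assumptions based on the example command:

```python
# Minimal sketch (not FACSvatar code): expand a "_p*" csv pattern and derive a
# participant id (p0, p1, ...) from each filename, so every file can be sent
# under its own "openface.pN.<name>" topic and reach the matching avatar.
import glob
import re

pattern = "2people_60fps_p*.csv"
for path in sorted(glob.glob(pattern)):
    match = re.search(r"_p(\d+)\.csv$", path)
    participant = f"p{match.group(1)}" if match else "p0"
    topic = f"openface.{participant}.{path[:-len('.csv')]}"
    print(topic)  # e.g. openface.p0.2people_60fps_p0, openface.p1.2people_60fps_p1
```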

@l-h-e (Author) commented Aug 6, 2020

Thank you for your reply.
I see.
I tried this command and I could make the 2 avatars move:

python main.py --csv_arg 2people_60fps_p*.csv --pub_ip facsvatar_bridge

@NumesSanguis (Owner)

Great that you could confirm that :)
