hw140701 / videoto3dposeandbvh
Convert video to the bvh motion file
Home Page: https://www.stubbornhuang.com/613
License: Other
I can't install torchsample using the method from the link you provided; the error is as follows:
pip install -e git+https://github.com/ncullen93/torchsample.git#egg=torchsample
Obtaining torchsample from git+https://github.com/ncullen93/torchsample.git#egg=torchsample
Cloning https://github.com/ncullen93/torchsample.git to d:\game\mmd\mmdmatic\src\torchsample
Running command git clone -q https://github.com/ncullen93/torchsample.git 'D:\game\MMD\mmdmatic\src\torchsample'
fatal: early EOF
fatal: the remote end hung up unexpectedly
fatal: index-pack failed
error: RPC failed; curl 56 OpenSSL SSL_read: SSL_ERROR_SYSCALL, errno 10054
ERROR: Command errored out with exit status 128: git clone -q https://github.com/ncullen93/torchsample.git 'D:\XXXXXX\src\torchsample' Check the logs for full command output.
I ran this command inside an Anaconda3 virtual environment. If torchsample has to be installed manually, could you tell me exactly where it should be installed? Thanks.
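For what it's worth, the `curl 56 ... errno 10054` error above is a dropped network connection during the clone, not a problem with the install location. A manual-install sketch, assuming the failure is just connection flakiness (`http.postBuffer` is a standard git config key; an editable install works from any directory, so the checkout can live anywhere):

```shell
# Enlarge git's HTTP buffer to survive flaky connections, then clone
# manually and install in editable mode from the local checkout.
git config --global http.postBuffer 524288000
git clone https://github.com/ncullen93/torchsample.git
pip install -e ./torchsample
```

After this, `import torchsample` should work from the same environment regardless of where the clone lives, because the editable install registers its path with the interpreter.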
Before computing the direction cosine matrix, how do you compute each joint's x_dir, y_dir, and z_dir, and how do you define the rotation order?
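For reference, a minimal sketch of one common convention (an assumption on my part, not necessarily what this repo does): take x_dir along the bone, use a second reference point to fix the roll and obtain z_dir by a cross product, then complete the right-handed frame with y_dir. The three unit vectors stacked as rows form the direction cosine matrix; the Euler order is whatever the BVH `CHANNELS` line declares (e.g. `Zrotation Xrotation Yrotation`).

```python
import numpy as np

def joint_dcm(parent, child, reference):
    """Build a joint's local frame from 3D positions (a sketch: x_dir runs
    along the bone and `reference` pins down the roll about it)."""
    x_dir = child - parent
    x_dir = x_dir / np.linalg.norm(x_dir)
    z_dir = np.cross(x_dir, reference - parent)
    z_dir = z_dir / np.linalg.norm(z_dir)
    y_dir = np.cross(z_dir, x_dir)
    # rows of the DCM are the local axes expressed in world coordinates
    return np.stack([x_dir, y_dir, z_dir])
```

The resulting matrix is orthonormal by construction, so it can be decomposed into Euler angles in whatever order the skeleton's channels specify.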
In joints_detectors\Alphapose\fn.py, line 219: stickwidth should be cast with int(stickwidth).
Hi,
Could you explain how to use the OpenPose and HRNet models to predict the 2D keypoints? Could you update the docs for this?
OpenPose doesn't work at all,
and HRNet throws an error:
Traceback (most recent call last):
File "videopose.py", line 332, in
inference_video('outputs/inputvideo/video.avi', 'hr_pose')
File "videopose.py", line 195, in inference_video
main(args)
File "videopose.py", line 91, in main
model_pos = model_pos.cuda()
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 307, in cuda
return self._apply(lambda t: t.cuda(device))
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 203, in _apply
module._apply(fn)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 225, in _apply
param_applied = fn(param)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 307, in
return self._apply(lambda t: t.cuda(device))
RuntimeError: CUDA error: device-side assert triggered
Could you help with this? Thank you.
Do you know what's wrong with this? It seems strange.
$ python videopose.py
the video is 25.051 f/s
Loading YOLO model..
outputs/inputvideo/kunkun_cut_one_second.mp4 --- elapsed time: 1.0411371119844262 s
Traceback (most recent call last):
File "videopose.py", line 332, in
inference_video('outputs/inputvideo/kunkun_cut_one_second.mp4', 'alpha_pose')
File "videopose.py", line 195, in inference_video
main(args)
File "videopose.py", line 74, in main
keypoints = detector_2d(video_name)
File "/mnt/sdb1/pose_estimation/3D_pose_estimation/VideoTo3dPoseAndBvh/joints_detectors/Alphapose/gene_npz.py", line 36, in generate_kpts
final_result, video_name = handle_video(video_file)
File "/mnt/sdb1/pose_estimation/3D_pose_estimation/VideoTo3dPoseAndBvh/joints_detectors/Alphapose/gene_npz.py", line 122, in handle_video
det_loader = DetectionLoader(data_loader, batchSize=args.detbatch).start()
File "/mnt/sdb1/pose_estimation/3D_pose_estimation/VideoTo3dPoseAndBvh/joints_detectors/Alphapose/dataloader.py", line 280, in init
self.det_model.cuda()
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 307, in cuda
return self._apply(lambda t: t.cuda(device))
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 203, in _apply
module._apply(fn)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 203, in _apply
module._apply(fn)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 203, in _apply
module._apply(fn)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 225, in _apply
param_applied = fn(param)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 307, in
return self._apply(lambda t: t.cuda(device))
RuntimeError: CUDA error: out of memory
The camera.py source file is as follows:
import h5py
import numpy as np
from pathlib import Path

def load_camera_params(file):
    cam_file = Path(file)
    cam_params = {}
    azimuth = {
        '54138969': 70, '55011271': -70, '58860488': 110, '60457274': -100
    }
    with h5py.File(cam_file) as f:
        subjects = [1, 5, 6, 7, 8, 9, 11]
        for s in subjects:
            cam_params[f'S{s}'] = {}
            for _, params in f[f'subject{s}'].items():
                name = params['Name']
                name = ''.join([chr(c) for c in name])
                val = {}
                val['R'] = np.array(params['R'])
                val['T'] = np.array(params['T'])
                val['c'] = np.array(params['c'])
                val['f'] = np.array(params['f'])
                val['k'] = np.array(params['k'])
                val['p'] = np.array(params['p'])
                val['azimuth'] = azimuth[name]
                cam_params[f'S{s}'][name] = val
    return cam_params
When running the statement cam_params = load_camera_params('cameras.h5')[subject][cam_id],
this line and many lines after it raise errors:
---> 17 name = ''.join([chr(c) for c in name])
TypeError: only integer scalar arrays can be converted to a scalar index
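The TypeError happens because `chr(c)` is receiving an array rather than a scalar: newer h5py versions return the `Name` dataset as a 2-D array, so each `c` in the comprehension is itself an array. A hedged fix (the `raw` array below is a stand-in for `params['Name']`, which I'm assuming has shape `(len, 1)`) is to flatten before converting code points:

```python
import numpy as np

# stand-in for params['Name'] as newer h5py returns it: a (len, 1) array
# of character code points
raw = np.array([[83], [49]], dtype=np.uint16)
name = ''.join(chr(int(c)) for c in raw.flatten())
```

With `raw.flatten()`, each `c` is a scalar, so `chr()` accepts it on any h5py version.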
I fed in a walking video, and when I load the resulting BVH file the skeleton is very jittery. Is there any way to deal with this?
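One common remedy (a sketch, not a feature of this repo) is to low-pass filter each joint's trajectory over time before the BVH conversion, e.g. a centered moving average along the frame axis:

```python
import numpy as np

def smooth(poses3d, k=5):
    """Centered moving average along the frame axis.
    poses3d has shape (frames, joints, 3); k is the window size."""
    kernel = np.ones(k) / k
    return np.apply_along_axis(
        lambda t: np.convolve(t, kernel, mode='same'), 0, poses3d)
```

A Savitzky-Golay filter (scipy.signal.savgol_filter) or a one-euro filter preserves fast motion better than a plain average, at the cost of a little more tuning.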
As the title says; I'm not sure whether this is feasible...
Hello. Thank you for the work. Do you think it's possible to write to the .fbx format using the output of the SMPL model? I have the following parameters:
pred_cam (n_frames, 3) # weak perspective camera parameters in cropped image space (s,tx,ty)
orig_cam (n_frames, 4) # weak perspective camera parameters in original image space (sx,sy,tx,ty)
verts (n_frames, 6890, 3) # SMPL mesh vertices
pose (n_frames, 72) # SMPL pose parameters
betas (n_frames, 10) # SMPL body shape parameters
joints3d (n_frames, 49, 3) # SMPL 3D joints
joints2d (n_frames, 21, 3) # 2D keypoint detections by STAF if pose tracking enabled otherwise None
bboxes (n_frames, 4) # bbox detections (cx,cy,w,h)
frame_ids (n_frames,) # frame ids in which the subject with this tracking id appears
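If it helps: the `pose (n_frames, 72)` array is 24 joints x 3 axis-angle components, and FBX/BVH exporters generally want per-joint rotation matrices or Euler angles. A minimal Rodrigues conversion sketch (generic math, not tied to any particular exporter API):

```python
import numpy as np

def axis_angle_to_matrix(aa):
    """Rodrigues' formula: axis-angle vector (3,) -> 3x3 rotation matrix."""
    theta = np.linalg.norm(aa)
    if theta < 1e-8:
        return np.eye(3)
    k = aa / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# one frame of SMPL pose: 24 joints x 3 axis-angle values
pose_frame = np.zeros(72)
joint_rotations = [axis_angle_to_matrix(pose_frame[3 * i:3 * i + 3])
                   for i in range(24)]
```

From the per-joint matrices you can derive Euler angles in whatever channel order the target format's skeleton declares.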
I used a text-to-motion model: https://github.com/korrawe/guided-motion-diffusion
It can directly generate an .npy file. Can I load that and convert it to BVH directly?
Alternatively, does the current model support exporting BVH from OpenPose (25 keypoints) yet?
Hello, are the output 3D point coordinates normalized? If so, why are there values greater than 1?
Hello,
Could you tell me whether conversion of the Trinity Speech-Gesture dataset is supported, or whether there is some way to convert it? Thanks.
Example:
HIERARCHY
ROOT Hips
{
OFFSET -14.64140 90.27770 -84.91600
CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
JOINT Spine
{
OFFSET 0.00000 13.20850 -1.60436
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT Spine1
{
OFFSET 0.00000 8.61716 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT Spine2
{
OFFSET 0.00000 8.61717 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT Spine3
{
OFFSET 0.00000 8.61717 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT Neck
{
OFFSET 0.00000 11.07920 1.10792
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT Neck1
{
OFFSET 0.00000 7.08032 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT Head
{
OFFSET 0.00000 7.08031 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT RightShoulder
{
OFFSET -0.01000 7.91373 5.19711
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightArm
{
OFFSET -18.41580 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightForeArm
{
OFFSET -29.11090 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHand
{
OFFSET -24.65040 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandThumb1
{
OFFSET -5.35114 -0.85590 3.97906
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandThumb2
{
OFFSET -4.52789 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandThumb3
{
OFFSET -2.46976 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT RightHandIndex1
{
OFFSET -14.26970 0.00000 2.88139
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandIndex2
{
OFFSET -5.48836 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandIndex3
{
OFFSET -3.01859 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT RightHandMiddle1
{
OFFSET -14.26970 0.00000 -0.09147
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandMiddle2
{
OFFSET -6.17441 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandMiddle3
{
OFFSET -3.56743 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT RightHandRing1
{
OFFSET -12.69180 0.00000 -3.06433
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandRing2
{
OFFSET -5.62556 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandRing3
{
OFFSET -3.56744 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT RightHandPinky1
{
OFFSET -11.11390 0.00000 -6.03719
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandPinky2
{
OFFSET -4.52789 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightHandPinky3
{
OFFSET -2.46976 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
}
}
}
}
JOINT LeftShoulder
{
OFFSET 0.01000 7.91373 5.19711
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftArm
{
OFFSET 18.41580 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftForeArm
{
OFFSET 29.11090 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHand
{
OFFSET 24.65040 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandThumb1
{
OFFSET 5.35114 -0.85590 3.97906
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandThumb2
{
OFFSET 4.52789 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandThumb3
{
OFFSET 2.46977 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT LeftHandIndex1
{
OFFSET 14.26970 0.00000 2.88139
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandIndex2
{
OFFSET 5.48836 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandIndex3
{
OFFSET 3.01859 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT LeftHandMiddle1
{
OFFSET 14.26970 0.00000 -0.09147
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandMiddle2
{
OFFSET 6.17440 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandMiddle3
{
OFFSET 3.56744 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT LeftHandRing1
{
OFFSET 12.69180 0.00000 -3.06433
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandRing2
{
OFFSET 5.62556 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandRing3
{
OFFSET 3.56744 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
JOINT LeftHandPinky1
{
OFFSET 11.11390 0.00000 -6.03719
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandPinky2
{
OFFSET 4.52789 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftHandPinky3
{
OFFSET 2.46976 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
}
}
}
}
JOINT pCube4
{
OFFSET 0.00000 10.91090 1.10792
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
}
}
JOINT RightUpLeg
{
OFFSET -9.98441 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightLeg
{
OFFSET 0.00000 -42.11790 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightFoot
{
OFFSET 0.00000 -45.00020 -0.00001
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightForeFoot
{
OFFSET 0.00000 -3.74820 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT RightToeBase
{
OFFSET 0.00000 0.06325 14.43120
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
}
}
JOINT LeftUpLeg
{
OFFSET 9.98441 0.00000 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftLeg
{
OFFSET 0.00000 -42.11800 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftFoot
{
OFFSET 0.00000 -45.00070 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftForeFoot
{
OFFSET 0.00000 -3.74824 0.00000
CHANNELS 3 Zrotation Xrotation Yrotation
JOINT LeftToeBase
{
OFFSET 0.00000 0.06325 14.43110
CHANNELS 3 Zrotation Xrotation Yrotation
End site
{
OFFSET 0.00000 0.00000 0.00000
}
}
}
}
}
}
}
MOTION
Frames:1
Frame Time: 0.050000
0.0 0.0 0.0 -0.0 0.0 -0.0 -1.0438600060281022 2.3587152053059897 -1.2019897431581434 -1.001710861815905 3.9499432452818337 0.4159642251569221 -1.0308159394841072 -1.0254198939760328 -0.3421914256452675 0.020989867445547768 0.19973630310387566 -0.618310852232598 -0.3197276373360509 12.097122524927356 0.18173158781933554 0.016670012435241553 -0.5045475503531174 -0.9459365635586078 0.14449033402431094 -15.811948793899363 -0.05198356166208762 -4.803187267192833 1.425229544967309 -14.953113151318428 67.19286454829422 10.776513302970162 13.750833647655174 22.399628423650412 0.5910102517395841 78.32902930570262 16.32197871348333 0.28157159400535886 -5.0503092453904825 4.52871 -0.12703 -1.59312 -4.64595e-06 9.01534e-06 -56.2342 8.23465e-06 -9.55045e-07 -56.2342 -38.7697 2.616e-06 -3.03728e-06 1.51694e-05 2.54582e-06 3.19778e-06 -3.36779e-06 6.40269e-06 -1.74067e-06 -36.6938 -3.44578e-06 4.82161e-06 1.03628e-05 2.25643e-06 -1.00237e-06 -1.59028e-15 3.18055e-15 1.90833e-14 -19.7583 1.99729e-06 9.82298e-07 1.19088e-05 -1.70535e-06 -1.47446e-06 -7.95139e-16 3.57812e-15 3.18055e-15 24.0542 -1.29415e-06 -2.03572e-07 28.8651 4.69993e-06 -2.49464e-06 28.8651 4.67541e-06 2.50252e-06 5.506066226025543 1.3766285140780876 13.34645761940873 -67.05098683202189 19.972927943281537 -15.991218926177313 -26.63707388373633 -1.732274591107232 -76.56005279831253 -12.684982116077768 0.25795658013984124 0.5589712097729244 -0.474211 0.0677175 -8.12381 -7.57701e-06 -7.52602e-07 44.5675 -2.34874e-06 -9.76287e-07 44.5675 12.4601 -8.06117e-07 4.96022e-06 -1.98259e-05 -5.83324e-07 -1.40447e-07 -3.18055e-15 1.59028e-15 -9.54166e-15 11.1059 -5.13129e-07 -3.38467e-06 3.12111e-07 1.50825e-07 1.16352e-06 5.56875e-08 -1.53149e-07 -1.37717e-07 4.77241 1.48801e-06 -1.13665e-06 -2.79682e-07 7.26536e-07 7.29834e-07 -1.95777e-07 5.08575e-07 5.10883e-07 -5.60077 7.82297e-07 -2.03231e-07 -6.72092 -3.47366e-06 4.68464e-06 -6.72092 -1.30631e-08 -7.87263e-07 -1.19271e-15 1.09925e-15 -3.18055e-15 8.04465 3.49397 
-19.2195 -2.21424e-05 10.395 0.0142586 -5.62103 -15.6599 3.19225 3.97569e-16 7.95139e-16 -1.59028e-15 1.81331e-08 4.69756 -4.30096e-07 12.4594 -8.23857 6.88324 1.34357e-05 46.3794 0.00174233 -4.32522 -14.7485 -3.83531 3.18055e-15 -3.18055e-15 9.54166e-15 1.35867e-07 -10.745 -2.59575e-06
Hi, I have .npy data for a COCO skeleton (18 keypoints). In bvh_skeleton, the cmu_skeleton and human3.6_skeleton skeletons already have code to convert directly to BVH files, but COCO does not. How should I write an analogous converter?
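As a starting point, here is a hedged sketch of a COCO-18 parent map in the spirit of the existing *_skeleton modules. The joint names and indices below are the standard OpenPose/COCO-18 layout, not copied from this repo; the existing modules additionally define bone offsets and channel order, which you would mirror from cmu_skeleton.

```python
# index -> (joint name, parent index); -1 marks the root
COCO18 = {
    0: ('Nose', 1),       1: ('Neck', -1),
    2: ('RShoulder', 1),  3: ('RElbow', 2),   4: ('RWrist', 3),
    5: ('LShoulder', 1),  6: ('LElbow', 5),   7: ('LWrist', 6),
    8: ('RHip', 1),       9: ('RKnee', 8),    10: ('RAnkle', 9),
    11: ('LHip', 1),      12: ('LKnee', 11),  13: ('LAnkle', 12),
    14: ('REye', 0),      15: ('LEye', 0),
    16: ('REar', 14),     17: ('LEar', 15),
}

# derive the children table a BVH writer needs for its depth-first walk
children = {i: [] for i in COCO18}
for idx, (_, parent) in COCO18.items():
    if parent >= 0:
        children[parent].append(idx)
```

One caveat: COCO-18 has no hip-center or spine joints, so rooting at the neck (or synthesizing a mid-hip root from LHip/RHip) is a design choice you'd have to make before mirroring the HIERARCHY-writing code.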
How was the visualization with coordinates at the end made? I don't get that kind of image when I run it.
Hello, the output frame rate is 30 fps; how can I change it to a different frame rate?
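The frame rate lives in the `Frame Time:` line of a BVH file's MOTION section (seconds per frame), so as a quick post-hoc sketch you can rewrite that line; note this only relabels playback speed, it does not resample the motion frames:

```python
import re

def set_bvh_fps(bvh_text, fps):
    """Rewrite the 'Frame Time:' line to 1/fps seconds per frame."""
    return re.sub(r'Frame Time:\s*[\d.]+',
                  f'Frame Time: {1.0 / fps:.6f}', bvh_text)
```

For a real frame-rate change you would also need to interpolate or drop motion rows and update the `Frames:` count accordingly.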
I have a 3D pose sequence file,
and I want to convert it to a BVH file that includes fingers.
How can I do that?
I've read through the conversion code and still have a question: can pose3d really be converted into a proper BVH? My sense is that from 3D coordinates alone you can only recover bone directions, so how is something like the arm's twist computed? For example, going from a T-pose to palms facing up, how is the arm's twist angle determined? And if the result is bound to a 3D model, won't it look odd?
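The concern is well founded: any rotation about the bone's own axis leaves both of its endpoints where they are, so joint positions alone cannot observe twist. Converters typically pin it down with a reference direction taken from a sibling or child bone, and on a bound model the guessed twist can indeed look odd. A two-line illustration:

```python
import numpy as np

# A rotation about the bone axis (here x) moves neither endpoint of the
# bone, so 3D keypoints alone cannot observe the twist angle.
bone = np.array([1.0, 0.0, 0.0])
theta = np.pi / 3
twist = np.array([[1, 0, 0],
                  [0, np.cos(theta), -np.sin(theta)],
                  [0, np.sin(theta),  np.cos(theta)]])
assert np.allclose(twist @ bone, bone)  # endpoint unmoved: twist is invisible
```

This is why hand/forearm twist usually needs extra cues (e.g. hand keypoints) rather than arm joints alone.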
outputs/inputvideo/video.avi --- elapsed time: 7.605630503036082 s
Traceback (most recent call last):
File "videopose.py", line 332, in
inference_video('outputs/inputvideo/video.avi', 'alpha_pose')
File "videopose.py", line 195, in inference_video
main(args)
File "videopose.py", line 74, in main
keypoints = detector_2d(video_name)
File "/mnt/sdb1/pose_estimation/3D_pose_estimation/VideoTo3dPoseAndBvh/joints_detectors/Alphapose/gene_npz.py", line 63, in generate_kpts
kpts = np.array(kpts).astype(np.float32)
ValueError: setting an array element with a sequence.
Can you help?
Thank you for your great work. I'd like to know whether this project supports multi-person pose estimation. Thank you!
During debugging I found the program unresponsive for a long time (I waited as long as an hour), so I set breakpoints to investigate.
It turned out the program stays in the while loop at line 190 of /joints_detectors/Alphapose/gene_npz.py, apparently an infinite loop. The relevant code:
while writer.running():
    pass
writer.stop()
final_result = writer.results()
The while body contains only a pass, and pass does nothing, so how is the loop supposed to exit? And why is writer.stop() placed after the loop?
I tried modifying it, but after skipping the loop the output BVH contains only the first frame of the video.
My knowledge here is limited and I don't understand multithreaded programming well; if convenient, please leave an email address so we can discuss.
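For context, a hedged sketch of the producer/consumer pattern that loop relies on (the names mirror the question, not the repo's actual implementation): `running()` reflects whether a background writer thread still has queued frames, so the busy-wait exits on its own once the queue drains, and `stop()` afterwards joins the thread. Breaking out of the loop early is exactly why only the first frame survives.

```python
import queue
import threading

class Writer:
    """Toy stand-in: a background thread drains a queue of frames."""
    def __init__(self, frames):
        self.q = queue.Queue()
        for f in frames:
            self.q.put(f)
        self._results = []
        self._t = threading.Thread(target=self._work, daemon=True)
        self._t.start()

    def _work(self):
        while not self.q.empty():
            self._results.append(self.q.get())

    def running(self):          # True while frames remain unprocessed
        return not self.q.empty()

    def stop(self):             # wait for the worker to finish cleanly
        self._t.join()

    def results(self):
        return self._results

w = Writer([1, 2, 3])
while w.running():   # exits by itself once the worker empties the queue
    pass
w.stop()
final_result = w.results()
```

So the pass-only loop is intentional: the main thread just waits for the writer thread to finish, and stop() after the loop guarantees every frame has been processed before results are read.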
The BVH produced by the default AlphaPose detection has no hand keypoints; can it be configured to output them? The OpenPose model can be configured to output face and hand keypoints. Can AlphaPose do the same, or would it need to be reworked?
I was wondering whether there is a way to use two different camera angles to improve the result; that could be very helpful.
By the way, thanks for creating this; it was the only one that worked for me.
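As far as I can tell two-camera input isn't supported here, but with two calibrated cameras the standard approach is linear (DLT) triangulation of matched 2D keypoints. A sketch assuming known 3x4 projection matrices for both views:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """DLT triangulation: matched pixel coords x1, x2 (each (2,)) from two
    views with projection matrices P1, P2 (each 3x4) -> 3D point (3,)."""
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # the 3D point is the right null vector of A, in homogeneous coordinates
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

Averaging or fusing the per-view 3D estimates this way removes most of the depth ambiguity a single camera suffers from, but it requires calibrating the camera pair first.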
Hello, I'd like to ask: if I want to run the program on the GPU instead of the CPU, what code changes would achieve that?
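The usual PyTorch idiom (a generic sketch, not repo-specific) is to pick a device once and move both the model and its inputs to it; the `.cuda()` calls visible in the tracebacks above are this codebase's equivalent.

```python
import torch

# falls back to CPU automatically when no CUDA device is available
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = torch.nn.Linear(3, 3).to(device)   # move the parameters
x = torch.zeros(1, 3, device=device)       # allocate inputs on the same device
y = model(x)
```

The important part is that model and inputs live on the same device; mixing them raises a runtime error.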
I want to use coco_skeleton, but its class seems incomplete compared with the other two in bvh_skeleton.
How can I use the skeleton in this file to generate a BVH file? Thanks!