dongjk / faster_rcnn_keras
faster RCNN keras step by step implementation
full_labels = unmap(labels, total_anchors, inds_inside, fill=-1)
batch_label_targets = full_labels.reshape(-1, 1, 1, 1 * k)[batch_inds]
bbox_targets = np.zeros((len(inds_inside), 4), dtype=np.float32)
# bbox_targets = bbox_transform(anchors, gt_boxes[argmax_overlaps, :]
pos_anchors = all_anchors[inds_inside[labels == 1]]
bbox_targets = bbox_transform(pos_anchors, gt_boxes[argmax_overlaps, :][labels == 1])
bbox_targets = unmap(bbox_targets, total_anchors, inds_inside[labels == 1], fill=0)
batch_bbox_targets = bbox_targets.reshape(-1, 1, 1, 4 * k)[batch_inds]
padded_fcmap = np.pad(feature_map, ((0, 0), (1, 1), (1, 1), (0, 0)), mode='constant')
I found something confusing: c.any() is always False, and batch_bbox_targets is filled with zeros. The only thing I know for sure is that before
batch_bbox_targets = bbox_targets.reshape(-1, 1, 1, 4 * k)[batch_inds]
is executed, bbox_targets.reshape(-1, 1, 1, 4 * k).any() is True.
Looking forward to your suggestions, thanks!
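One way to localize the problem (toy shapes and made-up positive indices, not the repo's real data): build a tiny bbox_targets that is nonzero only at the positive-anchor rows, run the same reshape-and-index, and check .any() at each step to see where the nonzero values get lost.

```python
import numpy as np

k = 9                         # anchors per feature-map point
num_tiles = 4                 # feature-map points in this toy example
total_anchors = num_tiles * k

# Nonzero regression targets only at the positive-anchor rows.
bbox_targets = np.zeros((total_anchors, 4), dtype=np.float32)
pos_anchor_inds = np.array([10, 21])
bbox_targets[pos_anchor_inds] = 1.0

reshaped = bbox_targets.reshape(-1, 1, 1, 4 * k)   # (num_tiles, 1, 1, 4k)
print(reshaped.any())                              # True

# Tile indices derived the same way as in produce_batch:
batch_inds = pos_anchor_inds // k                  # [1, 2]
print(reshaped[batch_inds].any())
# True here; if it is False in your real run, the selected tiles contain
# no positive anchors, i.e. batch_inds and the positive rows of
# bbox_targets have drifted apart somewhere upstream (e.g. in unmap).
```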
I just read your post about Faster RCNN; it is amazing and very clear, and it has helped me a lot in my work. I want to extract regions from the RPN model, but training takes a very long time. Would you share your trained RPN weights? Thank you so much.
Why is the fixed number 1536 always used here, and what does it correspond to? Is it the number of filters, i.e. the channel depth? Sorry, I am new to this domain.
I am trying to understand the training process of the RPN. I have a problem with creating mini-batches of 256 anchors. If the feature map has shape 18x25 = 450 and every position has 9 anchors, that is 4050 potential anchors. The classification output shape will be 18x25x18 and the regression output shape will be 18x25x72. How do we select only 256 anchors? I read that we have to select 128 fg and 128 bg anchors randomly. If we label fg anchors with [1,0] and bg anchors with [0,1], how do we label the anchors that should be ignored? With [0,0]? I don't think [0,0] will prevent backpropagation of the loss over anchors that should be ignored.
For example, suppose 9 anchors share the same feature-map position and only 1 of the 9 is fg while the others are bg (based on the IoU calculations). How do we discard 7 of the other anchors, i.e. enforce the 50:50 fg:bg ratio (in this example 1 fg, 1 bg, and 7 discarded anchors)? If we can't change the output shape, we have to somehow discard the redundant bg anchors; we don't want the RPN to backpropagate the loss over all 9 anchors, since that would bias it toward bg anchors.
Please correct me if I am wrong and tell me what is wrong.
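For what it's worth, the standard Faster R-CNN trick is to use three label values, 1 (fg), 0 (bg) and -1 (ignore), and to mask the -1 entries out of the loss, rather than trying to encode "ignore" as [0,0]. A minimal sketch of the down-sampling step, assuming a flat NumPy label array (function name and signature are my own, not the repo's):

```python
import numpy as np

def sample_minibatch(labels, batch_size=256, fg_fraction=0.5, rng=None):
    """Keep at most batch_size anchors; everything else becomes -1 (ignored)."""
    rng = rng or np.random.default_rng(0)
    labels = labels.copy()
    fg_inds = np.where(labels == 1)[0]
    bg_inds = np.where(labels == 0)[0]
    num_fg = min(int(batch_size * fg_fraction), len(fg_inds))
    num_bg = batch_size - num_fg
    # Disable surplus foreground anchors.
    if len(fg_inds) > num_fg:
        labels[rng.choice(fg_inds, len(fg_inds) - num_fg, replace=False)] = -1
    # Disable surplus background anchors.
    if len(bg_inds) > num_bg:
        labels[rng.choice(bg_inds, len(bg_inds) - num_bg, replace=False)] = -1
    return labels

labels = np.zeros(4050, dtype=np.int64)   # toy: 4050 anchors, 30 of them fg
labels[:30] = 1
sampled = sample_minibatch(labels)
print((sampled == 1).sum(), (sampled == 0).sum())  # 30 226
```

In the loss you then multiply the per-anchor cross-entropy by the mask (labels != -1), so ignored anchors contribute exactly zero gradient; the output shape never needs to change.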
What changes should we make to train on our own dataset? Please guide us through that.
I don't understand these lines in your blog:
prepare mini batch data
With the down-sampled anchors, we now need to calculate the feature-map point position for each anchor sample, and use that position to form our mini-batch. For example, an anchor sample with index 150, divided by 9, gives the integer 16; this 16 represents a point (1, 2), the second-row, third-column point in the feature map.
batch_inds = inds_inside[labels != -1]
batch_inds = (batch_inds // k).astype(int)
Why are we dividing by k? Please help me understand.
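Dividing by k collapses an anchor index down to its feature-map point: anchors are stored k at a time per point, so anchors 0..8 belong to point 0, anchors 9..17 to point 1, and so on. A toy check of the arithmetic (the width of 14 is an assumption chosen so the blog's example works out):

```python
k, width = 9, 14    # 9 anchors per point; width assumed for this example

anchor_ind = 150
point = anchor_ind // k     # 16: anchors 144..152 all sit on point 16
x = point % width           # 2: third column
y = point // width          # 1: second row
print(point, x, y)          # 16 2 1
```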
My output from the anchor boxes and the ground-truth boxes is:
anchor box shape: (395, 9, 4), so N = 395
ground truth box shape: (2, 4), so K = 2
How do I calculate the overlaps between them? Can anyone help me? Here is how I did it, but it creates a problem later on in bbox_transform:
gt_boxes[argmax_overlaps, :][labels == 1] shape is (18, 4)
pos_anchors shape is (18, 9, 4)
These are the results further on.
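A likely cause, judging by the (18, 9, 4) vs (18, 4) mismatch: the anchors were never flattened. The overlaps (and bbox_transform) expect 2-D box arrays, so reshape (395, 9, 4) to (395*9, 4) first. A self-contained IoU sketch with toy boxes (bbox_overlaps here is my own minimal version, not the repo's Cython one):

```python
import numpy as np

def bbox_overlaps(anchors, gt_boxes):
    """IoU between (N, 4) anchors and (K, 4) gt boxes, boxes as x1,y1,x2,y2."""
    ax1, ay1, ax2, ay2 = [anchors[:, i, None] for i in range(4)]
    gx1, gy1, gx2, gy2 = [gt_boxes[None, :, i] for i in range(4)]
    iw = np.maximum(0, np.minimum(ax2, gx2) - np.maximum(ax1, gx1))
    ih = np.maximum(0, np.minimum(ay2, gy2) - np.maximum(ay1, gy1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_g = (gx2 - gx1) * (gy2 - gy1)
    return inter / (area_a + area_g - inter)

# Toy stand-ins with valid x1 < x2, y1 < y2 coordinates.
rng = np.random.default_rng(0)
xy = rng.random((395 * 9, 2)) * 80
wh = rng.random((395 * 9, 2)) * 20 + 1
anchors = np.concatenate([xy, xy + wh], axis=1)     # (3555, 4), flattened
gt = np.array([[10, 10, 50, 50], [60, 60, 90, 90]], dtype=np.float64)

overlaps = bbox_overlaps(anchors, gt)
print(overlaps.shape)                               # (3555, 2)
```

With the anchors flattened this way, pos_anchors comes out as (18, 4) and lines up with the (18, 4) gt slice in bbox_transform.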
Following this article would be much clearer with some output examples pasted alongside the code snippets. E.g. this:
padded_fcmap = np.pad(feature_map, ((0, 0), (1, 1), (1, 1), (0, 0)), mode='constant')
padded_fcmap = np.squeeze(padded_fcmap)
batch_tiles = []
for ind in batch_inds:
    x = ind % width
    y = int(ind / width)
    fc_3x3 = padded_fcmap[y:y+3, x:x+3, :]
    batch_tiles.append(fc_3x3)
What is the output of batch_tiles supposed to resemble? It would really ease interpretation if we could follow along with some pasted samples of the output.
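If it helps: each element of batch_tiles is a 3x3 spatial window of the padded feature map centered on one anchor's feature-map point, so stacking them gives (batch, 3, 3, channels). A runnable toy version (the 18x25x512 shape is assumed from the thread's VGG-style numbers):

```python
import numpy as np

height, width, channels = 18, 25, 512      # assumed feature-map shape
feature_map = np.zeros((1, height, width, channels), dtype=np.float32)

padded_fcmap = np.pad(feature_map, ((0, 0), (1, 1), (1, 1), (0, 0)),
                      mode='constant')
padded_fcmap = np.squeeze(padded_fcmap)    # (20, 27, 512)

batch_inds = [0, 16, 449]                  # toy tile indices
batch_tiles = []
for ind in batch_inds:
    x = ind % width
    y = ind // width
    batch_tiles.append(padded_fcmap[y:y + 3, x:x + 3, :])

print(np.asarray(batch_tiles).shape)       # (3, 3, 3, 512)
```

The padding by one pixel on each side is what lets the 3x3 window stay in bounds even at the corners of the map.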
Hello there, I've finished training both the RPN and the RCNN, and I would like to ask how to use the RCNN model. Its output is a score and a bbox, where the bbox is in the transformed format (bbox_transform). I tried using bbox_transform_inv with all_anchors and the bbox from the model as inputs, but the result is still wrong and I can't figure out why: the bbox values become very large positive or negative numbers. How do I transform them back to the 'box' format? Thanks.
Here is an example of the output:
[[-2.06265278e+01 -2.90417881e+01 4.07944298e+01 2.49541988e+01]
[ 1.62567091e+00 9.09532261e+00 2.32062912e+01 7.29226589e-01]
[ 5.22885227e+00 -2.84000320e+01 -1.52643404e+01 5.47946501e+00]
[-8.79242897e+00 3.88164520e-02 1.08110008e+01 1.45621538e+00]
[ 5.11084557e+00 1.62669678e+01 1.31373644e+01 -6.32110691e+00]
[ 1.93882313e+01 -1.42370987e+01 -3.54342747e+00 -1.59158049e+01]
[-3.17380466e+01 -9.74339390e+00 -6.12756968e+00 1.19106941e+01]
[-4.58960533e+00 1.00035973e+01 1.29482708e+01 1.56503105e+00]
[-1.44369049e+01 2.17501259e+01 1.91259995e+01 2.62187557e+01]
[ 6.41727018e+00 1.89460793e+01 1.08964367e+01 2.20485687e+00]
[ 1.96513882e+01 -1.47549334e+01 9.54520702e+00 7.58097267e+00]
[-2.49597931e+01 -7.45701456e+00 2.45447216e+01 1.39055519e+01]
[ 4.59815292e+01 -1.63944206e+01 -3.88751459e+00 2.62286263e+01]
[-1.15779409e+01 -2.13902016e+01 1.61578941e+01 8.84201431e+00]
[-2.81816196e+01 -1.31887712e+01 1.99995556e+01 9.78964615e+00]
[-6.61824703e+00 9.05736256e+00 7.66280937e+00 -3.76174498e+00]
[ 7.84473801e+00 5.73573303e+00 1.83825951e+01 7.65380478e+00]
[-8.90626907e-02 2.52789307e+01 1.62431049e+01 -2.31389198e+01]
[-1.26966763e+01 3.03341064e+01 2.20400620e+01 4.78993607e+00]
[ 2.69762650e+01 2.65424347e+00 7.71452713e+00 -6.97057581e+00]
[ 1.60277500e+01 1.20451441e+01 1.32827864e+01 6.35036087e+00]
[ 9.55057335e+00 -1.78789940e+01 2.32457089e+00 7.98925209e+00]
[-1.58132668e+01 4.20994568e+01 2.41683598e+01 -9.10912514e+00]
[-8.76365471e+00 -1.72816277e+01 -2.46611156e+01 2.54118843e+01]
[ 1.74420624e+01 1.01519690e+01 1.27691832e+01 -2.12998466e+01]
[-2.38134270e+01 6.05399323e+00 9.34597874e+00 -2.20235348e-01]
[ 1.60610085e+01 1.37900572e+01 -2.42706413e+01 -4.17082453e+00]
[ 1.60180130e+01 -1.08507805e+01 4.54763107e+01 -8.51396656e+00]
[-4.18583107e+00 -1.07547846e+01 -4.28413093e-01 2.25683403e+01]
[-1.30901928e+01 2.86955395e+01 1.22368784e+01 2.81098690e+01]
[ 1.20341396e+01 -9.97926140e+00 3.07621479e+00 -6.10108376e+00]
[ 1.91136570e+01 3.62575111e+01 2.23474865e+01 -1.21735344e+01]
[-3.78589668e+01 5.52252150e+00 1.10084066e+01 -1.46700611e+01]
[-1.58746653e+01 -4.88495827e+00 -1.45989761e+01 -3.29151678e+00]
[-2.65787625e+00 -1.53239155e+01 -8.50374699e+00 6.97214890e+00]
[-1.95937672e+01 3.41852188e+00 2.20483136e+00 8.77031708e+00]
[-2.56555786e+01 9.26528549e+00 -5.03830767e+00 -1.79322853e+01]
[ 2.53718233e+00 1.22436409e+01 1.72604370e+01 6.99793386e+00]
[ 4.06809807e+00 1.31606865e+01 1.71905518e+01 -2.85897522e+01]
[ 1.09103384e+01 -1.17121363e+00 -1.77140732e+01 -9.50297737e+00]
[-1.69283628e+00 -1.73882866e+01 1.50500031e+01 9.43029213e+00]
[-2.59532490e+01 -5.57506275e+00 -1.27909527e+01 -3.12677145e+00]
[ 5.85928583e+00 3.09311795e+00 1.44104586e+01 -5.41480207e+00]
[ 1.11001816e+01 1.05207939e+01 -5.08982086e+00 2.57757359e+01]
[ 8.64583302e+00 1.40349221e+00 -3.57421150e+01 4.20639658e+00]
[ 2.06394615e+01 1.77725067e+01 -1.40788784e+01 2.12706566e+01]
[ 1.53030796e+01 9.11560059e-02 -4.13422441e+00 7.77760220e+00]
[ 2.20285206e+01 -1.78933010e+01 -3.24855156e+01 1.11828671e+01]
[ 3.45003605e-02 -2.20023594e+01 3.18113174e+01 1.22378702e+01]
[ 1.23573275e+01 -1.38355799e+01 6.87291145e-01 -3.65642319e+01]
[ 1.26581478e+01 5.63907623e+00 -5.58917046e+00 -3.27587624e+01]
[-2.47371979e+01 2.13000278e+01 3.62949705e+00 1.75886192e+01]
[-2.12825832e+01 -4.68225002e-01 -1.05977936e+01 4.98907709e+00]
[-1.60728416e+01 5.08582783e+00 -8.87197399e+00 3.17428703e+01]
[-3.42208405e+01 -2.27608738e+01 -1.99556217e+01 -3.88715172e+00]
[-6.37220860e+00 1.88723862e+00 1.48059301e+01 2.24504972e+00]
[ 4.21745872e+00 2.03796272e+01 7.88794184e+00 -1.93918095e+01]
[-1.66200256e+00 -7.80647755e+00 1.77254248e+00 -5.16499949e+00]
[ 4.34796143e+01 -1.45930538e+01 4.73239470e+00 1.05581551e+01]
[-7.07003927e+00 8.49991417e+00 -1.40374231e+01 -2.09618835e+01]
[-1.99530220e+00 1.81494141e+01 2.63823700e+00 -3.15938797e+01]
[-3.00670834e+01 -1.50360737e+01 1.33986301e+01 1.81569061e+01]
[-1.98947582e+01 -6.92561865e+00 3.03887916e+00 4.66833448e+00]
[-1.31479521e+01 2.54492035e+01 -2.43505859e+01 -2.71949100e+01]
[-2.71621552e+01 5.61091852e+00 -1.17839365e+01 -1.01790667e+00]
[ 2.52452230e+00 9.86784363e+00 1.93379688e+00 1.77585888e+00]
[ 1.08017950e+01 -8.49272728e+00 7.08114481e+00 -1.55343885e+01]
[ 2.02185802e+01 -5.20953417e-01 3.02986031e+01 -1.32682681e-01]
[-1.57997160e+01 -5.62446499e+00 -9.11291790e+00 -1.01782970e+01]
[ 1.99370232e+01 1.92993984e+01 2.77917728e+01 -2.28918018e+01]
[-3.00279312e+01 -1.26846504e+00 -2.08283520e+01 -1.32156553e+01]
[ 3.93276501e+00 1.94628859e+00 1.54592438e+01 -1.99921932e+01]
[ 1.53853436e+01 -5.75450563e+00 1.68453217e+01 -2.03911667e+01]
[-1.06053829e-01 -1.46453896e+01 3.04325867e+00 -2.49733086e+01]
[ 2.84858742e+01 -6.70212460e+00 -3.28537865e+01 -3.46892786e+00]
[-3.13147469e+01 1.93892822e+01 -1.62992115e+01 -3.05912399e+01]
[-1.18755140e+01 7.12878227e+00 4.97267962e-01 -1.30484784e+00]
[ 1.36628103e+01 -1.44974766e+01 -1.56676092e+01 1.20003281e+01]
[-6.84868336e-01 9.46141434e+00 -2.29251504e+00 -4.86994600e+00]
[ 1.25102119e+01 9.05077457e+00 6.33800983e-01 8.96958637e+00]
[-1.05859985e+01 -3.20511131e+01 2.40000572e+01 1.75320888e+00]
[-2.59381056e+00 2.36859703e+01 3.32202873e+01 -2.01698055e+01]
[-2.53794360e+00 3.82069969e+00 -2.65382690e+01 1.53089972e+01]
[ 5.11644554e+00 3.63876843e+00 5.14221764e+00 1.31915894e+01]
[ 3.28244209e+00 7.87049532e+00 2.69283533e+00 1.01090240e+01]
[-1.99264030e+01 -9.67034245e+00 2.63487434e+00 1.34040718e+01]
[ 1.77897568e+01 -3.02934647e-03 -8.03396034e+00 9.75190926e+00]
[ 1.54270401e+01 7.88619518e+00 -2.52248859e+01 -2.31376247e+01]
[-9.03828049e+00 1.48714027e+01 2.32924366e+00 -6.05805683e+00]
[-4.12043381e+00 -1.13523121e+01 1.46896219e+00 1.95736275e+01]
[-2.64479184e+00 -2.71880417e+01 8.52537060e+00 2.50925875e+00]
[ 9.12898064e+00 -6.72820568e-01 -1.84285679e+01 -1.32050915e+01]
[-3.94303169e+01 1.04073792e+01 -9.57496643e-01 -9.45750237e+00]
[-2.21778526e+01 -3.59094429e+01 -2.04762077e+01 -1.46300488e+01]
[ 2.99226074e+01 -2.92488813e+00 4.93066072e-01 2.72150230e+01]
[-8.57196236e+00 1.83543015e+01 -5.70362282e+00 1.24158421e+01]
[ 1.11614285e+01 -1.16512318e+01 5.16784382e+00 1.15113735e+00]
[-1.05167465e+01 5.82191944e+00 -6.88554955e+00 1.99065285e+01]
[-2.35193157e+01 -1.50745177e+00 -5.01455021e+00 -9.24168110e+00]
[-2.43324757e+00 1.25330305e+01 -1.36760492e+01 -2.65534377e+00]
[-4.06817484e+00 -1.85885487e+01 -3.01529007e+01 7.76705742e+00]
[-2.24220486e+01 -1.46913409e+00 1.48737574e+01 1.87174320e+01]
[ 1.19579468e+01 -3.26625490e+00 1.77153530e+01 2.16761055e+01]
[-3.58425236e+00 2.29178505e+01 1.16974831e-01 4.72650003e+00]
[-2.51738777e+01 2.64596510e+00 1.42777157e+00 -1.64129972e-01]
[ 8.85254002e+00 1.05576344e+01 1.93465471e+00 1.65240860e+01]
[-1.86390171e+01 -5.59190750e+00 2.44524479e-01 1.12873745e+01]
[-5.70671701e+00 -1.64081573e+01 3.32451973e+01 1.42151375e+01]
[-8.06047821e+00 -1.06355047e+01 -5.88465166e+00 7.70208418e-01]
[ 4.01449299e+00 -1.06868153e+01 7.20469856e+00 -3.30455208e+01]
[ 2.08964844e+01 -6.79812050e+00 -2.72099247e+01 1.58625641e+01]
[ 1.82464104e+01 -1.41323891e+01 -1.82462997e+01 5.80139160e+00]
[ 1.05649185e+01 -2.17260094e+01 -4.67952728e-01 2.10373974e+01]
[ 2.28185120e+01 1.49749393e+01 -1.26778250e+01 2.92946739e+01]
[-2.02501392e+00 -1.35289192e+01 -2.66412306e+00 1.24479761e+01]
[-3.51350784e-01 2.59152527e+01 -1.62154350e+01 2.54646015e+00]
[-9.32517624e+00 -1.09843245e+01 -1.82040749e+01 1.16592693e+01]
[ 1.09934199e+00 2.11267281e+01 -9.17308521e+00 5.98384142e+00]
[ 8.90635872e+00 -4.86824512e+00 -3.60652771e+01 6.31729603e-01]
[-2.78498292e+00 -2.11736279e+01 -2.10900803e+01 1.49898887e-01]
[ 1.59142857e+01 -1.42541618e+01 -4.28925705e+00 1.03488035e+01]
[ 6.49266911e+00 -6.02750206e+00 5.67045212e+00 -6.65791750e-01]
[-1.42746696e+01 1.55785251e+00 -2.78358793e+00 -5.60324860e+00]
[ 3.90716696e+00 -6.57288551e-01 -5.98617744e+00 1.40519581e+01]
[-9.84055805e+00 -1.51533384e+01 3.00966911e+01 -2.84340134e+01]
[ 3.12007961e+01 -3.52915287e+00 -6.83337975e+00 1.25210600e+01]
[ 2.55123672e+01 7.37008095e+00 -1.29847889e+01 -8.47499180e+00]
[ 3.70332870e+01 1.00357752e+01 -8.34461880e+00 6.66151810e+00]
[-1.27090168e+01 -1.47494602e+00 2.26974030e+01 -1.53345490e+00]
[-1.37035694e+01 9.31728935e+00 2.32966270e+01 -1.05412769e+01]
[ 2.43966980e+01 -1.10440941e+01 1.31920137e+01 -1.64916439e+01]
[ 7.85769463e-01 -1.23669605e+01 4.44545507e+00 -4.99441099e+00]
[ 9.99106789e+00 -1.40073338e+01 1.47428198e+01 -1.37511635e+01]
[-7.80899048e-01 8.60315990e+00 2.54462357e+01 -9.79767227e+00]
[-1.69488049e+01 1.00592604e+01 -3.05456734e+00 4.43933868e+00]
[ 1.35445232e+01 -1.30903473e+01 -9.54470444e+00 -4.93997002e+00]
[ 1.67202415e+01 2.97171354e-01 1.04634533e+01 6.55204678e+00]
[-1.35370770e+01 -2.16899338e+01 -3.25351143e+00 -4.96232510e-01]
[ 2.64583378e+01 1.48113861e+01 8.59098434e-02 -1.21290684e+01]
[-5.68574524e+00 3.37738457e+01 6.10242271e+00 -2.05709000e+01]
[-2.66041946e+00 6.16599274e+00 1.16397820e+01 -4.37285900e+00]
[-1.20242481e+01 2.38685093e+01 8.46934032e+00 1.65273972e+01]
[-3.38336945e+01 -2.33910370e+01 3.64274597e+01 2.83297968e+00]
[-3.19304123e+01 -3.36854982e+00 -1.21547098e+01 1.27229538e+01]
[-2.26036148e+01 -1.51568718e+01 -3.12383347e+01 1.60241299e+01]
[-8.35206985e+00 5.28707314e+00 -7.27585506e+00 -2.40449505e+01]
[ 3.62347298e+01 -6.60408783e+00 -3.64447904e+00 -6.15608072e+00]
[-9.73602486e+00 -1.10780535e+01 1.36630020e+01 1.04698000e+01]
[-5.95400953e+00 -2.51786947e+00 -1.33984013e+01 1.30770588e+00]
[-2.78627777e+01 7.42147064e+00 4.74635553e+00 1.61567459e+01]
[-2.07280006e+01 -1.23294477e+01 -7.84477139e+00 2.71289301e+00]
[-1.61122379e+01 1.61857700e+01 -3.35216599e+01 -4.09674606e+01]
[ 2.58045311e+01 -1.29408274e+01 1.02003622e+01 2.50833149e+01]
[ 1.84239349e+01 2.47956562e+00 1.49199018e+01 -1.61182332e+00]
[-2.80294609e+01 -5.42108393e+00 -1.56443186e+01 -1.05065966e+01]
[ 2.07592392e+00 1.93909569e+01 -1.12358093e+00 2.06704483e+01]
[ 2.16178436e+01 -7.33679390e+00 1.73929195e+01 8.99847031e+00]
[-2.52988834e+01 -1.10992222e+01 -1.99103508e+01 -1.16963863e-01]
[ 1.15277958e+01 -5.77752972e+00 1.40498209e+00 -5.18552399e+00]
[-3.27203560e+00 -3.27318420e+01 -1.55212021e+01 1.00328751e+01]
[ 5.84209776e+00 -3.39023056e+01 3.71398735e+00 1.85512295e+01]
[-9.61000919e-01 5.69345713e-01 -6.45574284e+00 -8.47114372e+00]
[ 4.48262072e+00 -1.04902458e+01 2.87705612e+01 2.13196526e+01]
[-1.16753221e+00 1.68934860e+01 8.05399895e+00 -6.01839447e+00]
[ 1.79960976e+01 -2.17672997e+01 1.21129932e+01 7.88831711e+00]
[-1.25387974e+01 -2.54090214e+01 -7.96282864e+00 2.51256714e+01]
[ 2.14872169e+00 -2.88355045e+01 1.20757103e+01 4.24790001e+01]
[ 2.17153511e+01 -3.06203651e+01 1.72938728e+01 -1.60865974e+01]
[ 2.54389501e+00 -8.35985422e-01 1.30648031e+01 -4.56465721e+00]
[-5.96182108e+00 7.14096594e+00 9.99941158e+00 -1.28208208e+01]
[ 9.13703442e-01 3.08263149e+01 -1.21695595e+01 2.11448812e+00]
[ 1.06483307e+01 3.06128845e+01 -9.83990288e+00 6.41876936e-01]
[-1.72132645e+01 -5.46452904e+00 -7.77397108e+00 -2.26363144e+01]
[-8.77066898e+00 2.08808231e+01 3.82699165e+01 -4.56157446e+00]
[ 3.05853891e+00 7.97471619e+00 -2.43193150e+01 -1.36093845e+01]
[-1.74715710e+01 -1.35196676e+01 2.83831673e+01 -4.23387098e+00]
[-1.31308346e+01 1.53725557e+01 4.45657444e+00 3.54858351e+00]
[-2.65252209e+01 2.10294571e+01 -6.44234896e+00 -5.32297564e+00]
[ 4.82135296e+00 3.11434937e+00 7.83245754e+00 -7.90870667e+00]
[ 2.63197346e+01 -8.06696129e+00 -1.53155079e+01 -2.18475533e+01]
[ 2.09771137e+01 -2.99464359e+01 1.18533230e+00 1.04121494e+01]
[ 1.23303556e+00 1.17938118e+01 -1.53217382e+01 5.45040751e+00]
[-3.30760288e+00 -7.19704199e+00 2.93374557e+01 -6.44288158e+00]
[ 8.83793259e+00 1.46199169e+01 -3.21102095e+00 1.79114799e+01]
[-6.69487762e+00 -2.07867098e+00 2.55993977e+01 9.22529984e+00]
[-4.58939791e+00 1.13071709e+01 -8.51708221e+00 1.53944530e+01]
[-8.00637245e+00 -7.89592266e+00 -2.87560921e+01 1.09390984e+01]
[-5.11176968e+00 3.85240173e+01 1.99310017e+01 7.20341921e+00]
[-1.82089520e+01 -1.15782108e+01 2.32500095e+01 1.70795441e+01]
[-5.38826656e+00 -5.16052604e-01 7.55389071e+00 -2.78993368e+00]
[-2.93931408e+01 2.84307137e+01 -8.59254932e+00 6.63110781e+00]
[ 2.08302784e+00 -1.36329422e+01 2.28370094e+00 9.54674125e-01]
[ 1.40419159e+01 2.99034023e+01 2.43818436e+01 -1.16714678e+01]
[-3.78456116e+00 6.34562492e+00 1.30365887e+01 1.35493774e+01]
[ 1.08999281e+01 -9.13232231e+00 1.49827671e+01 4.86897135e+00]
[-4.10065155e+01 -2.02025833e+01 6.21772051e-01 2.32425213e+01]
[-6.52799988e+00 -1.28677311e+01 1.28160191e+01 8.52158546e+00]
[-4.13176012e+00 1.66961842e+01 -2.02388496e+01 3.90056300e+00]
[ 1.85877590e+01 -1.07356920e+01 -3.37900162e+00 -1.21755600e-01]
[-1.21110983e+01 8.66492653e+00 1.38547068e+01 -5.93335199e+00]]
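One likely issue: the RCNN head's deltas are relative to the proposals that were fed into ROI pooling (the RPN outputs after decoding, clipping, and NMS), not to all_anchors, so the deltas must be paired with those same proposals in the same order. A sketch of the decoding, using the standard Faster R-CNN parametrization (the ±1 offsets vary between implementations, so check this against the repo's own bbox_transform):

```python
import numpy as np

def bbox_transform_inv(boxes, deltas):
    """Apply (dx, dy, dw, dh) deltas to x1,y1,x2,y2 boxes."""
    w = boxes[:, 2] - boxes[:, 0] + 1.0
    h = boxes[:, 3] - boxes[:, 1] + 1.0
    cx = boxes[:, 0] + 0.5 * w
    cy = boxes[:, 1] + 0.5 * h
    dx, dy, dw, dh = deltas[:, 0], deltas[:, 1], deltas[:, 2], deltas[:, 3]
    pcx, pcy = dx * w + cx, dy * h + cy
    pw, ph = np.exp(dw) * w, np.exp(dh) * h
    return np.stack([pcx - 0.5 * pw, pcy - 0.5 * ph,
                     pcx + 0.5 * pw - 1, pcy + 0.5 * ph - 1], axis=1)

# Pair each delta row with the SAME proposal the head saw as input;
# pairing them with all_anchors mis-centers and mis-scales every box,
# which produces exactly the huge values shown above.
proposals = np.array([[10., 10., 60., 60.]])
deltas = np.array([[0.1, 0.0, 0.2, 0.0]])
print(bbox_transform_inv(proposals, deltas))
```

Also note that dx and dy are scaled by the box width/height and dw, dh are exponentiated, so raw network outputs in the ±40 range (as in the dump above) would explode; values that large usually mean the pairing or the training targets are off.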
Hey! I have my own annotation file in the format path,xmin,ymin,xmax,ymax,label. How do I feed it in directly?
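Not the repo's code, but one simple way to load that CSV format into per-image ground-truth boxes (the function name and return shape are my own choice) and then use it in place of the XML parse_label call:

```python
import csv
from collections import defaultdict

def load_annotations(csv_path):
    """Parse lines of 'path,xmin,ymin,xmax,ymax,label' into per-image boxes."""
    boxes = defaultdict(list)
    with open(csv_path, newline='') as f:
        for path, xmin, ymin, xmax, ymax, label in csv.reader(f):
            boxes[path].append((int(xmin), int(ymin),
                                int(xmax), int(ymax), label))
    return boxes
```

Each gt_boxes array the training loop needs would then come from boxes[image_path], instead of the parse_label(...) XML lookup.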
I have about 4500 images (415, 313) across 7 classes. Can anyone suggest a strategy for training the RPN model (batch size, epochs, learning rate, ...)?
Thank you so much!
Thanks for the detailed tutorial and implementation. However, I am stuck on training both the RPN and the RCNN at the same time. Could you please post code, or just give me a hint, on how to train both networks simultaneously? It would also be nice if you could upload a test file.
Thanks in advance.
Can you show me step by step how to train the RPN model? I ran RPN.py but nothing happened. Thank you so much.
Does increasing the number of scales and ratios improve the performance of the RPN model?
Hello Dongjk, I'm reading the RPN code again. Can you explain these lines in the function produce_batch: x = ind % width, y = int(ind/width)? What are x and y? Thank you so much.
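In produce_batch, ind is the flat row-major index of a feature-map point, so x = ind % width is its column and y = ind // width is its row. The same thing, written with divmod for a few toy indices (width 25 is just for illustration):

```python
width = 25          # feature-map width (toy value)
for ind in [0, 24, 25, 449]:
    y, x = divmod(ind, width)   # row and column of point `ind`
    print(ind, x, y)
```

So index 25 is the first point of the second row, and index 449 is the last point of an 18x25 map.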
Thanks for the repository: the simplest and most educational implementation I have seen!
I am trying to figure out how you use the image annotations (the labels for each GT box), but I haven't seen more than
_, gt_boxes, h_w = parse_label(anno_path+line.split()[0]+'.xml')
Can you point me to a line or snippet where the ROI annotations are used? Thanks.
1. A filter is needed for x1 > x2 and y1 > y2.
2. A filter is needed for x or y outside [0, border].
3. The batch_size implementation cannot handle the "empty array found" case.
batch_inds = inds_inside[labels != -1]
batch_inds = (batch_inds // k).astype(int)
These are the indices of the tiles in which a few boxes get the max overlap.
batch_label_targets = full_labels.reshape(-1, 1, 1, 1 * k)[batch_inds]
should be
batch_label_targets = full_labels.reshape(-1, 1, 1, 1 * k)[inds_inside[labels != -1]]
In the same way,
batch_bbox_targets = bbox_targets.reshape(-1, 1, 1, 4 * k)[batch_inds]
should be
batch_bbox_targets = bbox_targets.reshape(-1, 1, 1, 4 * k)[inds_inside[labels != -1]]
batch_inds are tile indices, but we need to pick by anchor index. Please let me know if I am missing anything here.
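One hedged way to express this distinction in code (toy shapes, variable names loosely following produce_batch): keep anchor indices for selecting per-anchor targets, and derive tile indices only for cutting the 3x3 feature-map windows. Note that indexing per anchor means reshaping to (-1, 1, 1, 1) rather than the per-tile (-1, 1, 1, k); mixing the two indexings is exactly the confusion being discussed here.

```python
import numpy as np

k, num_tiles = 9, 450
total_anchors = num_tiles * k

labels = -np.ones(total_anchors)
labels[[10, 21, 400]] = 1                     # toy fg anchors
inds_inside = np.arange(total_anchors)        # pretend all anchors are inside

anchor_inds = inds_inside[labels != -1]       # anchor indices: select targets
tile_inds = (anchor_inds // k).astype(int)    # tile indices: cut 3x3 windows

full_labels = labels
batch_label_targets = full_labels.reshape(-1, 1, 1, 1)[anchor_inds]
print(batch_label_targets.shape, list(tile_inds))   # (3, 1, 1, 1) [1, 2, 44]
```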