
pytorch_speaker_verification's People

Contributors

harryvolek, mazzzystar, pranshurastogi29, seandickert


pytorch_speaker_verification's Issues

Workflow of the project

Can you describe the workflow of the project? For example:
Step 1. Run the preprocess script.
Step 2. Run the dvector_create script.
Step 3. ... and maybe you can take it from there.
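
For reference, a minimal sketch of the assumed workflow, based on the scripts mentioned elsewhere on this page (data_preprocess.py, train_speech_embedder.py, dvector_create.py); the idea that a training flag in the repo's config file switches between training and evaluation is an assumption, not something confirmed here:

import subprocess

# Step 1: slice the raw TIMIT audio into log-mel spectrogram .npy files
subprocess.run(["python", "data_preprocess.py"], check=True)

# Step 2: train the speaker embedder (training flag enabled in the config)
subprocess.run(["python", "train_speech_embedder.py"], check=True)

# Step 3: evaluate EER on the test split (training flag disabled, model_path
# pointed at a saved checkpoint)
subprocess.run(["python", "train_speech_embedder.py"], check=True)

# Step 4: generate d-vector sequences for UIS-RNN
subprocess.run(["python", "dvector_create.py"], check=True)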

I can't load the trained model

@HarryVolek I have trained the model on the TIMIT database and got this message:
"Done, trained model saved at ./speech_id_checkpoint/final_epoch_950_batch_id_141.model"
But when I run the test, I can't load the trained model. The log is:
Traceback (most recent call last):
File "train_speech_embedder.py", line 168, in
test('speech_id_checkpoint/ckpt_epoch_840_batch_id_141.pth')
File "train_speech_embedder.py", line 109, in test
embedder_net.load_state_dict(torch.load(model_path))
File "/media/diskc/wq/.local/lib/python3.5/site-packages/torch/nn/modules/module.py", line 721, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for SpeechEmbedder:
Missing key(s) in state_dict: "LSTM_stack.weight_ih_l1", "LSTM_stack.bias_hh_l1", "LSTM_stack.bias_hh_l0", "LSTM_stack.bias_ih_l0", "LSTM_stack.weight_hh_l2", "LSTM_stack.weight_hh_l1", "LSTM_stack.bias_ih_l1", "LSTM_stack.weight_ih_l0", "LSTM_stack.weight_ih_l2", "LSTM_stack.weight_hh_l0", "LSTM_stack.bias_hh_l2", "LSTM_stack.bias_ih_l2", "projection.weight", "projection.bias".
Unexpected key(s) in state_dict: "module.LSTM_stack.weight_ih_l0", "module.LSTM_stack.weight_hh_l0", "module.LSTM_stack.bias_ih_l0", "module.LSTM_stack.bias_hh_l0", "module.LSTM_stack.weight_ih_l1", "module.LSTM_stack.weight_hh_l1", "module.LSTM_stack.bias_ih_l1", "module.LSTM_stack.bias_hh_l1", "module.LSTM_stack.weight_ih_l2", "module.LSTM_stack.weight_hh_l2", "module.LSTM_stack.bias_ih_l2", "module.LSTM_stack.bias_hh_l2", "module.projection.weight", "module.projection.bias".
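
A likely cause (not confirmed by the repo): the checkpoint was saved from a model wrapped in nn.DataParallel, which prefixes every parameter name with "module.". A hedged workaround is to strip that prefix before loading:

import torch

from speech_embedder_net import SpeechEmbedder  # the repo's model class

embedder_net = SpeechEmbedder()
state_dict = torch.load('speech_id_checkpoint/ckpt_epoch_840_batch_id_141.pth',
                        map_location='cpu')
# drop the "module." prefix added by nn.DataParallel
state_dict = {k.replace('module.', '', 1): v for k, v in state_dict.items()}
embedder_net.load_state_dict(state_dict)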

poor performance

I am training the network on the VCTK corpus (sample rate = 48 kHz,
109 speakers with an average of 300 utterances per speaker).
I got a very high EER (0.45) and I can't understand why the performance is poor: is it the sample rate, not enough data, or a problem with the model?
Any thoughts? (For data augmentation, I don't think adding noise will help, because the preprocessing already removes noise from the data.)
Your help is much appreciated.
Thank you.

Sliding window implementation for training

I haven't found a sliding-window implementation for training as described in the Google paper (Fig. 3). Only the first and last 180 frames are taken from each segment. Why didn't you implement a sliding window? Could you explain the reason?
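
For comparison, a minimal sketch of what sliding-window frame selection could look like (illustrative only; the 180-frame window and 50% hop are assumptions, not the repository's choice):

import numpy as np

def sliding_windows(S, win=180, hop=90):
    """Yield overlapping (n_mels, win) slices of a log-mel spectrogram S."""
    for start in range(0, S.shape[1] - win + 1, hop):
        yield S[:, start:start + win]

# example: a 40-mel, 500-frame spectrogram yields windows starting at frames 0, 90, 180, 270
S = np.random.rand(40, 500)
windows = list(sliding_windows(S))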

EER

Sir, can you tell me what EER you got on the TIMIT database?
You didn't mention it in the README.

about the sampling rate

How much does the sampling rate affect inference? Can I train the network at a 16 kHz sampling rate while running inference at another sampling rate, such as 32 kHz or 48 kHz?
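
One common way to avoid the mismatch (an assumption, not something the repository prescribes) is to resample every input to the training rate before extracting features; the file name and 16 kHz target here are illustrative:

import librosa

TRAIN_SR = 16000  # assumed training sample rate

wav, sr = librosa.load('test.wav', sr=None)  # keep the file's native rate
if sr != TRAIN_SR:
    wav = librosa.resample(wav, orig_sr=sr, target_sr=TRAIN_SR)
    sr = TRAIN_SR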

How to use dvector_create.py

Hi!

Could you please explain how to run dvector_create.py on the TIMIT dataset?

This program tries to load some .wav files (line 91). However, the original data in TIMIT are .WAV files, and after preprocessing they are converted to .npy files. So where do the .wav files come from?

Thanks!
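
If the mismatch is only the extension case, one hedged workaround is to collect the audio files case-insensitively and feed those paths to the script (list_wavs is a hypothetical helper, not part of the repo):

import glob
import os

def list_wavs(folder):
    """Return all .wav/.WAV files in folder, matched case-insensitively."""
    return sorted(f for f in glob.glob(os.path.join(folder, '*'))
                  if f.lower().endswith('.wav'))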

confusion about enrollment and inference handling? shape issue...?

#18 (comment)

In train_speech_embedder.py, the batch is split into enrollment and verification halves:

enrollment_batch, verification_batch = torch.split(mel_db_batch, int(mel_db_batch.size(1)/2), dim=1)

Following the same approach, I separated it into two functions, enrollment and verification.
In enrollment:

enrollment_batch, _ = torch.split(mel_db_batch, int(mel_db_batch.size(1)/2), dim=1)
torch.save("./dummy/speaker_embedding.pt", enrollment_centroids)

in verification,

_ , verification_batch = torch.split(mel_db_batch, int(mel_db_batch.size(1)/2), dim=1)
enrollment_centroids = torch.load("./dummy/speaker_embedding.pt")
sim_matrix = get_cossim(verification_embeddings, enrollment_centroids)

Is this the correct procedure or not?

And what is the right way to handle enrollment and inference for speaker identification?

I am a beginner in PyTorch; any suggestions, sir?

Thank you.
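
A self-contained round trip of the enrollment/verification idea above, with dummy tensors standing in for the embedder output (the [N, M/2, 256] shapes and the 256-dim embedding size are assumptions):

import os

import torch
import torch.nn.functional as F

os.makedirs('./dummy', exist_ok=True)

# enrollment: average the L2-normalized enrollment embeddings per speaker
enrollment_embeddings = F.normalize(torch.randn(4, 3, 256), dim=2)  # [N, M/2, 256]
enrollment_centroids = enrollment_embeddings.mean(dim=1)            # [N, 256]
torch.save(enrollment_centroids, './dummy/speaker_embedding.pt')

# verification: load the centroids and score new embeddings against them
centroids = torch.load('./dummy/speaker_embedding.pt')
verification_embeddings = F.normalize(torch.randn(4, 3, 256), dim=2)   # [N, M/2, 256]
sim_matrix = F.cosine_similarity(verification_embeddings.unsqueeze(2),  # [N, M/2, 1, 256]
                                 centroids.unsqueeze(0).unsqueeze(0),   # [1, 1, N, 256]
                                 dim=-1)                                # -> [N, M/2, N]
print(sim_matrix.shape)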

are the d-vector embeddings from this "GE2E Speaker Verification" implementation compatible with UIS-RNN or not?

Sir, I have one doubt.

Are the d-vector embeddings from this PyTorch implementation of "Generalized End-to-End Loss for Speaker Verification" compatible with UIS-RNN or not?

What is the accuracy of the d-vector embeddings?

Does it produce continuous d-vector embeddings (as sequences) or not?

Sir, once you have seen these URLs: I used the TIMIT dataset to generate the d-vector embeddings, but I don't know how to feed or initialize these embeddings.

google/uis-rnn#6 (comment)

google/uis-rnn#6 (comment)

Some of the libraries are only able to produce per-utterance d-vector embeddings, while for UIS-RNN, we require continuous d-vector embeddings (as sequences). We have no guarantee which third-party library supports this

Can you help me, sir? Thank you.

How to Use Your Audio Data Set

Hello, this code is for the TIMIT dataset. If I switch it to my own audio data, it will not work. How can I use my own audio data as the dataset? Thank you very much for your guidance.

Speaker Verification Predict

Hi, first of all, thank you for your awesome work. I intend to use this for speaker verification: I know a speaker and I need to know whether another audio belongs to him or not. I have some questions regarding your repo:

  • I had thought of saving the speaker embeddings; that is equivalent to the part of the code where you calculate the centroid, right? I could calculate the centroid of each of my known speakers and just load it later.

  • As I said, my intention is to give the network an audio and test it against a known speaker, but I am a bit lost on how I could do that with your code. Could you point me in the right direction? I get lost after you calculate the sim_matrix and the FAR, FRR and thresh parameters.

  • I think my main question is how to fix that threshold parameter and how to get a binary output indicating whether two inputs correspond to the same person or not (a small sketch follows below).

Once more thank you for your work!
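
A minimal decision sketch along those lines (assumptions: you already have an utterance-level d-vector for the test audio and a saved centroid for the claimed speaker; the threshold value is only illustrative and would be tuned on held-out data, e.g. at the EER operating point):

import torch
import torch.nn.functional as F

def same_speaker(test_dvector, enrolled_centroid, threshold=0.6):
    """Return (decision, score) for one test d-vector against one speaker centroid."""
    score = F.cosine_similarity(test_dvector, enrolled_centroid, dim=0)
    return bool(score > threshold), float(score)

# usage with dummy 256-dim embeddings
decision, score = same_speaker(F.normalize(torch.randn(256), dim=0),
                               F.normalize(torch.randn(256), dim=0))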

lstm input wrong shape

I am not 100% sure, but as I understand it, the input to an LSTM layer should have shape (seq_len, batch, input_size) according to the documentation, while the input to SpeechEmbedder has shape (batch, seq_len, input_size). Is this a mistake, or am I misunderstanding something?
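
The shape is fine if the LSTM is constructed with batch_first=True, which switches the expected layout to (batch, seq_len, input_size). A quick check (the 40/768/3 sizes are only illustrative, not necessarily the repository's settings):

import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=40, hidden_size=768, num_layers=3, batch_first=True)
x = torch.randn(4, 180, 40)  # (batch, seq_len=frames, input_size=n_mels)
out, _ = lstm(x)
print(out.shape)             # torch.Size([4, 180, 768])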

no of epochs

My training set has 5000 speakers. How many epochs should I set for training? I think 950 is too much, no?

Pre-trained model

Hi! Do you have any plans on sharing a pre-trained model on TIMIT?

perm action

Hi, sir. I don't think the perm and unperm actions in your code make any difference, because the permutation is along the batch dimension and, in the forward pass, the samples along the batch dimension are independent of each other. Or have I misunderstood your purpose?

how about a wav that hasn't been trained on?

@HarryVolek
Thanks for your project; I have run it successfully.
But there is another problem: if there is speech from a person whose wav files were never trained on, can this verify them?
I mean, we haven't trained on any speech wav of the new person.

how to fine-tune the TIMIT model with my own speakers' audio?

@HarryVolek Sir, I have a model from PyTorch_Speaker_Verification (950 epochs, the maximum training epochs for TIMIT).
Sir, I have a doubt: can we fine-tune (transfer learning / retrain) that PyTorch model with my own speakers' audio?
Is that possible or not, sir? How would I implement it? (A sketch follows below.)

Another doubt: if we have only 10-20 speakers' audio utterances, will it give good accuracy or not?
(or)
If we build our own model from just those 10-20 speakers' utterances, will it give good accuracy or not?

Also, given the voice activity detection, how can we achieve "wake word detection", sir? And how can I do this dynamically?

I am concentrating on this. Thank you for your initiative.
Thank you lots, @HarryVolek sir.
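
A minimal fine-tuning sketch, under the assumption that the repo's speech_embedder_net module exposes SpeechEmbedder and a GE2ELoss taking a device argument; the optimizer and learning rate are illustrative, not the repository's settings:

import torch

from speech_embedder_net import SpeechEmbedder, GE2ELoss  # assumed repo module layout

device = torch.device('cpu')
embedder_net = SpeechEmbedder().to(device)

# start from the TIMIT checkpoint instead of random weights
embedder_net.load_state_dict(
    torch.load('speech_id_checkpoint/final_epoch_950_batch_id_141.model',
               map_location=device))

ge2e_loss = GE2ELoss(device)

# fine-tune with a smaller learning rate than training from scratch
optimizer = torch.optim.SGD(
    list(embedder_net.parameters()) + list(ge2e_loss.parameters()), lr=0.001)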

can GE2E loss work with a CNN?

I tried to replace your LSTM with a CNN, but the loss doesn't decrease. The input and loss function are the same. Can you explain why this happens?

OOM

Hi! I tried to run this code on Linux, on CPU, with 16 GB of memory. Memory usage grew quickly until the OS killed the process; it only ran for about 30 iterations.
Do you have any ideas for fixing the problem? What should I do to debug?
Thank you!

Question about L2 norm of embedding

Thanks for this package. I was wondering about the L2 norm being applied in speech_embedder_net.py line 32. It looks like the paper describes an L2 norm being calculated per embedding and then dividing the embedding by that value. However, I believe the code as written calculates the norm over the entire matrix. Is this intentional? That is, I would have thought line 32 should be: x = x / torch.norm(x, dim=1).unsqueeze(1). This calculates the norm per embedding.
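
A quick illustration of the difference the question is pointing at (not repository code):

import torch

x = torch.randn(6, 256)                            # 6 embeddings
global_norm = x / torch.norm(x)                    # one scalar norm over the whole matrix
per_embed = x / torch.norm(x, dim=1).unsqueeze(1)  # one norm per embedding, as in the paper
print(torch.norm(per_embed, dim=1))                # all (approximately) ones
print(torch.norm(global_norm, dim=1))              # generally not ones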

Issue in Using test_features in UIS_RNN

Hi HarryVolek,

I have created embeddings with the TIMIT model and generated the d-vectors as you explained in the README, but when I try to test the features in UIS-RNN it throws the following error:

"ValueError: test_sequence must be 2-dim array."
I guess we need to reshape the test_sequence.
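
A possible reshape sketch (an assumption about the expected format: UIS-RNN test sequences are 2-D float arrays of shape (num_windows, embedding_dim); the helper name is hypothetical):

import numpy as np

def as_test_sequence(dvectors):
    """Coerce d-vectors into the 2-D (num_windows, embedding_dim) float array UIS-RNN expects."""
    seq = np.asarray(dvectors, dtype=float)
    if seq.ndim == 1:              # a single d-vector
        seq = seq.reshape(1, -1)
    return seq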

Correctness of d-vectors

How do I evaluate the correctness of the d-vectors? My d-vectors change when I change the dataset on which I train my LSTM.

Overfitting

Hi,

I see the model is overfitting on the mix of data I am using. Can you please suggest some regularisation techniques that might help here? Should I add more data?

Sun Jul 7 10:31:55 2019 Epoch:1700[30/79],Iteration:134251 Loss:0.0008 TLoss:0.9285

Sun Jul 7 10:31:58 2019 Epoch:1700[60/79],Iteration:134281 Loss:0.0011 TLoss:0.6694

Sun Jul 7 10:32:05 2019 Epoch:1701[30/79],Iteration:134330 Loss:0.0631 TLoss:0.4391

Embedder-net()

Hi, I have some questions (this is my graduation project and it is extremely important to me):

- The output of embedder_net() is [N, 256]. I need to understand what N is exactly: is it the number of sliding windows (240 ms each)?
- Can we use this output (the embedder_net() output) for speaker diarization, i.e. can we apply clustering algorithms to these sequences for speaker diarization?
- Can I understand how you built train_sequence and train_cluster_id (the inputs to UIS-RNN)? My dataset is different from the TIMIT corpus (TIMIT is a speaker recognition dataset, not a speaker diarization dataset). (A sketch follows below.)
This is a link to the corpus I am using: https://github.com/EMRAI/emrai-synthetic-diarization-corpus
Thank you in advance for your help.
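
For the last question, a sketch of how UIS-RNN training inputs might be assembled from per-conversation d-vector sequences (names and shapes are assumptions, not the output format of the repository's dvector_create.py):

import numpy as np

def build_uisrnn_inputs(dvector_sequences, speaker_labels):
    """dvector_sequences: list of (num_windows_i, 256) arrays, one per conversation.
    speaker_labels: list of (num_windows_i,) integer speaker ids, one per window."""
    train_sequence = np.concatenate(dvector_sequences, axis=0).astype(float)
    train_cluster_id = np.concatenate(
        [[f'{i}_{int(s)}' for s in labels]   # make speaker ids unique per conversation
         for i, labels in enumerate(speaker_labels)])
    return train_sequence, train_cluster_id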

Bitrate/Sample Rate Differences

How much of an impact does the sample rate and bitrate have? My use case is that I need to perform speaker verification across multiple sample rates/codecs such as AMR and EVRC. I will of course be converting them to wav for training but since I will be performing the verification across multiple sample rates, I'm wondering if I should adapt my training set to have varying sample rates as well. Thoughts?

Frame rate problem

I am working on the VCTK corpus (48 kHz).
I can't train my network or preprocess the data at this sample rate; I always get an error (mainly in the preprocess and get_STFT_frames functions).
What should I modify in the code?
Thank you for your help.
(screenshot of the error attached)

Calculating EER

I can't quite understand the process of calculating the EER.
I would be grateful if someone could explain it to me.
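
A rough sketch of one way EER can be computed from a similarity matrix, by sweeping a threshold until the false acceptance and false rejection rates cross (shapes and the threshold grid are assumptions; this is illustrative, not the repository's exact test loop):

import torch

def compute_eer(sim_matrix):
    """sim_matrix: [N, M, N] cosine similarities of M test utterances per
    speaker against each of the N enrolled speaker centroids."""
    N, M, _ = sim_matrix.shape
    best_diff, eer = float('inf'), 1.0
    for thres in [0.01 * i for i in range(101)]:
        decisions = sim_matrix > thres
        # false acceptance: an utterance matches a *different* speaker's centroid
        far = sum(decisions[i].float().sum() - decisions[i, :, i].float().sum()
                  for i in range(N)) / (N * M * (N - 1))
        # false rejection: an utterance fails to match its *own* speaker's centroid
        frr = sum(M - decisions[i, :, i].float().sum() for i in range(N)) / (N * M)
        if abs(far - frr) < best_diff:
            best_diff, eer = abs(far - frr), float((far + frr) / 2)
    return eer

# usage with a dummy similarity matrix for 4 speakers x 3 utterances
print(compute_eer(torch.rand(4, 3, 4)))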

Excessive memory use due to train.num_workers > 0

Hello, thanks for sharing your code.

I've successfully tried it with the TIMIT data. However, I have come across a memory issue when training on a much larger dataset (over 7000 speakers): after the first epoch, memory usage starts rapidly increasing and eventually exceeds the available RAM.

From what I've been able to find, this is actually caused by a known problem with PyTorch DataLoader:

1: CPU memory gradually leaks when num_workers > 0 in the DataLoader
2: https://forums.fast.ai/t/runtimeerror-dataloader-worker-is-killed-by-signal/31277

The above links mention some possible workarounds you might be able to use in your code. Barring that, setting train.num_workers to zero (default is 8) solves this problem - possibly at the cost of speed, although in my case the training speed actually improved.

Even if you're not able to fix this completely, I would suggest at least changing the default train.num_workers setting from 8 to 0.

(Note: I suspect this may also be the cause of issue #20)
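
A minimal workaround sketch, mirroring the DataLoader construction shown later on this page (train_dataset and the batch size N come from the training script; the only change is num_workers=0):

from torch.utils.data import DataLoader

# num_workers=0 keeps data loading in the main process and sidesteps the worker leak
train_loader = DataLoader(train_dataset, batch_size=N, shuffle=True,
                          num_workers=0, drop_last=True)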

Question about inference

Hi @HarryVolek,

I trained the model correctly. But now I have some .wav files as input, so how can I use the trained model to do inference?
Also, could the inference be used for speaker identification (verifying whether the utterance belongs to one of a set of N speakers), or is it only valid for speaker verification (verifying whether the utterance is from the claimed speaker)?
Thanks in advance,
Tony

Question about the pipeline.

Thanks for sharing your great work. As a newcomer to speaker verification, I have two questions.
First, what is the purpose of perm and unperm in the training script?
Second, since the original data matrix is shuffled by the perm index, how is the loss calculated between different speakers?

Thanks a lot!

feature extraction

Sir, can you explain how you extracted the features for every utterance? For one speaker the features end up with shape (12, 40, 180); can you explain the dimensions?
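
A quick way to inspect the saved features and read off the dimensions (the file path is hypothetical; the interpretation as (partial utterances, mel bins, frames) follows from the preprocessing snippet quoted later on this page):

import numpy as np

feats = np.load('train_tisv/speaker0.npy')  # hypothetical per-speaker feature file
print(feats.shape)  # e.g. (12, 40, 180): 12 partial utterances x 40 mel bins x 180 frames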

License

I'm unable to find the license that's associated with this file. Would you mind adding a license? If possible, can you add a permissive license like the MIT or BSD? I'm unable to use any software at my company if it doesn't have an obvious license associated with it.

Thanks!

DataLoader

Hello,
I can't execute this part of the code in a Jupyter Notebook (there are no errors, but it never finishes; it does work on Google Colab):
train_dataset = SpeakerDatasetEMRAIreprocessed()
train_loader = DataLoader(train_dataset, batch_size=N, shuffle=True, num_workers=num_workers, drop_last=True)
Thank you for help

Two questions on Embeddings

About the align_embeddings

After calculating all the embeddings window by window, you then compute the average embedding with the align_embeddings function. But why do you hardcode these thresholds, and what do they mean?

for i, embedding in enumerate(embeddings):
    if (i*.12)+.24 < j*.401:
        end = end + 1
    else:
        partitions.append((start,end))
        start = end
        end = end + 1
        j += 1

In the original paper, it says "the final utterance-wise d-vector is generated by L2 normalizing the window-wise d-vectors, then taking the element-wise average".

About the embedding frame.

In your implementation of data_preprocess.py, you only save the first N frames and the last N frames to the .npy file, as below, which may drop a lot of data if the current utterance spans many frames:

utterances_spec = []
for utter_name in os.listdir(folder):
    if utter_name[-4:] == '.WAV':
        utter_path = os.path.join(folder, utter_name)          # path of each utterance
        utter, sr = librosa.core.load(utter_path, hp.data.sr)  # load utterance audio
        intervals = librosa.effects.split(utter, top_db=30)    # voice activity detection
        for interval in intervals:
            if (interval[1]-interval[0]) > utter_min_len:      # If partial utterance is sufficiently long,
                utter_part = utter[interval[0]:interval[1]]    # save first and last 180 frames of spectrogram.
                S = librosa.core.stft(y=utter_part, n_fft=hp.data.nfft,
                                      win_length=int(hp.data.window * sr), hop_length=int(hp.data.hop * sr))
                S = np.abs(S) ** 2
                mel_basis = librosa.filters.mel(sr=hp.data.sr, n_fft=hp.data.nfft, n_mels=hp.data.nmels)
                S = np.log10(np.dot(mel_basis, S) + 1e-6)      # log mel spectrogram of utterances
                utterances_spec.append(S[:, :hp.data.tisv_frame])   # first 180 frames of partial utterance
                utterances_spec.append(S[:, -hp.data.tisv_frame:])  # last 180 frames of partial utterance

Why not use all the frames to get more data into training? For example:

utterances_spec = []
j = hp.data.tisv_frame
while j < S.shape[1]:
    utterances_spec.append(S[:, j - hp.data.tisv_frame:j])
    j += hp.data.tisv_frame

===== Update =====
As for the second question, I tried it and I think I see why: is it because most sentences are no longer than 2 * tisv_frame, so you just crop the head and the tail?

trained model used in uis-rnn

When I trained a model with this project and then used it in UIS-RNN, errors came out.
Here is the training log:

/home/rice/anaconda3/envs/pytorch/bin/python3.6 /home/rice/PycharmProjects/uis-rnn-master/demo.py
Iter: 0 Training Loss: -292.7123
Negative Log Likelihood: 1.1230 Sigma2 Prior: -293.8359 Regularization: 0.0006
Iter: 10 Training Loss: -292.9026
Negative Log Likelihood: 1.0294 Sigma2 Prior: -293.9326 Regularization: 0.0006
Iter: 20 Training Loss: -293.1618
Negative Log Likelihood: 0.9518 Sigma2 Prior: -294.1143 Regularization: 0.0006
Iter: 30 Training Loss: -293.3418
Negative Log Likelihood: 0.8811 Sigma2 Prior: -294.2235 Regularization: 0.0006
Iter: 40 Training Loss: -293.6046
Negative Log Likelihood: 0.7965 Sigma2 Prior: -294.4017 Regularization: 0.0006
Iter: 50 Training Loss: -293.7784
Negative Log Likelihood: 0.7400 Sigma2 Prior: -294.5190 Regularization: 0.0006
Iter: 60 Training Loss: -294.0261
Negative Log Likelihood: 0.6534 Sigma2 Prior: -294.6801 Regularization: 0.0006
Iter: 70 Training Loss: -294.1154
Negative Log Likelihood: 0.6151 Sigma2 Prior: -294.7312 Regularization: 0.0006
Iter: 80 Training Loss: -294.3496
Negative Log Likelihood: 0.5512 Sigma2 Prior: -294.9014 Regularization: 0.0006
Iter: 90 Training Loss: -294.6035
Negative Log Likelihood: 0.4841 Sigma2 Prior: -295.0882 Regularization: 0.0006
Iter: 100 Training Loss: -294.6783
Negative Log Likelihood: 0.4460 Sigma2 Prior: -295.1250 Regularization: 0.0006
Iter: 110 Training Loss: -294.9049
Negative Log Likelihood: 0.3866 Sigma2 Prior: -295.2922 Regularization: 0.0006
Iter: 120 Training Loss: -295.1215
Negative Log Likelihood: 0.3335 Sigma2 Prior: -295.4556 Regularization: 0.0006
Iter: 130 Training Loss: -295.2138
Negative Log Likelihood: 0.3084 Sigma2 Prior: -295.5229 Regularization: 0.0006
Iter: 140 Training Loss: -295.3630
Negative Log Likelihood: 0.2796 Sigma2 Prior: -295.6432 Regularization: 0.0006
Iter: 150 Training Loss: -295.5717
Negative Log Likelihood: 0.2386 Sigma2 Prior: -295.8109 Regularization: 0.0006
Iter: 160 Training Loss: -295.6212
Negative Log Likelihood: 0.2382 Sigma2 Prior: -295.8600 Regularization: 0.0006
Iter: 170 Training Loss: -295.8949
Negative Log Likelihood: 0.1961 Sigma2 Prior: -296.0916 Regularization: 0.0006
Iter: 180 Training Loss: -296.0236
Negative Log Likelihood: 0.1764 Sigma2 Prior: -296.2007 Regularization: 0.0006
Iter: 190 Training Loss: -296.1873
Negative Log Likelihood: 0.1511 Sigma2 Prior: -296.3391 Regularization: 0.0006
Iter: 200 Training Loss: -296.3225
Negative Log Likelihood: 0.1493 Sigma2 Prior: -296.4724 Regularization: 0.0006
Iter: 210 Training Loss: -296.3659
Negative Log Likelihood: 0.1548 Sigma2 Prior: -296.5213 Regularization: 0.0006
Iter: 220 Training Loss: -296.5490
Negative Log Likelihood: 0.1174 Sigma2 Prior: -296.6670 Regularization: 0.0006
Iter: 230 Training Loss: -296.6978
Negative Log Likelihood: 0.1129 Sigma2 Prior: -296.8114 Regularization: 0.0006
Iter: 240 Training Loss: -296.7265
Negative Log Likelihood: 0.1292 Sigma2 Prior: -296.8564 Regularization: 0.0006
Iter: 250 Training Loss: -296.9639
Negative Log Likelihood: 0.1191 Sigma2 Prior: -297.0836 Regularization: 0.0006
Iter: 260 Training Loss: -297.1142
Negative Log Likelihood: 0.0966 Sigma2 Prior: -297.2114 Regularization: 0.0006
Iter: 270 Training Loss: -297.3285
Negative Log Likelihood: 0.0972 Sigma2 Prior: -297.4264 Regularization: 0.0006
Iter: 280 Training Loss: -297.3492
Negative Log Likelihood: 0.1149 Sigma2 Prior: -297.4648 Regularization: 0.0006
Iter: 290 Training Loss: -297.5171
Negative Log Likelihood: 0.0907 Sigma2 Prior: -297.6085 Regularization: 0.0006
Iter: 300 Training Loss: -297.6031
Negative Log Likelihood: 0.1217 Sigma2 Prior: -297.7255 Regularization: 0.0006
Iter: 310 Training Loss: -297.7408
Negative Log Likelihood: 0.1000 Sigma2 Prior: -297.8414 Regularization: 0.0006
Iter: 320 Training Loss: -297.9149
Negative Log Likelihood: 0.0958 Sigma2 Prior: -298.0113 Regularization: 0.0006
Iter: 330 Training Loss: -298.0205
Negative Log Likelihood: 0.1114 Sigma2 Prior: -298.1325 Regularization: 0.0006
Iter: 340 Training Loss: -298.1181
Negative Log Likelihood: 0.0878 Sigma2 Prior: -298.2065 Regularization: 0.0006
Iter: 350 Training Loss: -298.3074
Negative Log Likelihood: 0.0924 Sigma2 Prior: -298.4005 Regularization: 0.0006
Iter: 360 Training Loss: -298.4767
Negative Log Likelihood: 0.0768 Sigma2 Prior: -298.5542 Regularization: 0.0006
Iter: 370 Training Loss: -298.5929
Negative Log Likelihood: 0.0933 Sigma2 Prior: -298.6869 Regularization: 0.0006
Iter: 380 Training Loss: -298.7109
Negative Log Likelihood: 0.0876 Sigma2 Prior: -298.7992 Regularization: 0.0006
Iter: 390 Training Loss: -298.7951
Negative Log Likelihood: 0.0971 Sigma2 Prior: -298.8928 Regularization: 0.0006
Iter: 400 Training Loss: -298.9722
Negative Log Likelihood: 0.0889 Sigma2 Prior: -299.0617 Regularization: 0.0006
Iter: 410 Training Loss: -299.0759
Negative Log Likelihood: 0.1211 Sigma2 Prior: -299.1976 Regularization: 0.0006
Iter: 420 Training Loss: -299.3126
Negative Log Likelihood: 0.0831 Sigma2 Prior: -299.3963 Regularization: 0.0006
Iter: 430 Training Loss: -299.3480
Negative Log Likelihood: 0.1019 Sigma2 Prior: -299.4505 Regularization: 0.0006
Iter: 440 Training Loss: -299.5879
Negative Log Likelihood: 0.0894 Sigma2 Prior: -299.6780 Regularization: 0.0006
Iter: 450 Training Loss: -299.6454
Negative Log Likelihood: 0.1086 Sigma2 Prior: -299.7546 Regularization: 0.0006
Iter: 460 Training Loss: -299.8495
Negative Log Likelihood: 0.0820 Sigma2 Prior: -299.9322 Regularization: 0.0006
Iter: 470 Training Loss: -299.9776
Negative Log Likelihood: 0.0821 Sigma2 Prior: -300.0603 Regularization: 0.0006
Iter: 480 Training Loss: -300.1432
Negative Log Likelihood: 0.0770 Sigma2 Prior: -300.2209 Regularization: 0.0006
Iter: 490 Training Loss: -300.1850
Negative Log Likelihood: 0.0830 Sigma2 Prior: -300.2686 Regularization: 0.0006
Iter: 500 Training Loss: -300.3543
Negative Log Likelihood: 0.0824 Sigma2 Prior: -300.4373 Regularization: 0.0006
Iter: 510 Training Loss: -300.4816
Negative Log Likelihood: 0.0881 Sigma2 Prior: -300.5703 Regularization: 0.0006
Iter: 520 Training Loss: -300.6026
Negative Log Likelihood: 0.1058 Sigma2 Prior: -300.7090 Regularization: 0.0006
Iter: 530 Training Loss: -300.7959
Negative Log Likelihood: 0.0879 Sigma2 Prior: -300.8845 Regularization: 0.0006
Iter: 540 Training Loss: -300.9551
Negative Log Likelihood: 0.0754 Sigma2 Prior: -301.0312 Regularization: 0.0006
Iter: 550 Training Loss: -301.0412
Negative Log Likelihood: 0.0980 Sigma2 Prior: -301.1399 Regularization: 0.0006
Iter: 560 Training Loss: -301.1181
Negative Log Likelihood: 0.0952 Sigma2 Prior: -301.2140 Regularization: 0.0006
Iter: 570 Training Loss: -301.2436
Negative Log Likelihood: 0.1082 Sigma2 Prior: -301.3524 Regularization: 0.0006
Iter: 580 Training Loss: -301.4844
Negative Log Likelihood: 0.0801 Sigma2 Prior: -301.5652 Regularization: 0.0006
Iter: 590 Training Loss: -301.6274
Negative Log Likelihood: 0.0957 Sigma2 Prior: -301.7237 Regularization: 0.0006
Iter: 600 Training Loss: -301.7388
Negative Log Likelihood: 0.0951 Sigma2 Prior: -301.8346 Regularization: 0.0006
Iter: 610 Training Loss: -301.8443
Negative Log Likelihood: 0.0882 Sigma2 Prior: -301.9331 Regularization: 0.0006
Iter: 620 Training Loss: -301.9659
Negative Log Likelihood: 0.1069 Sigma2 Prior: -302.0734 Regularization: 0.0006
Iter: 630 Training Loss: -302.1609
Negative Log Likelihood: 0.1005 Sigma2 Prior: -302.2620 Regularization: 0.0006
Iter: 640 Training Loss: -302.2143
Negative Log Likelihood: 0.1000 Sigma2 Prior: -302.3149 Regularization: 0.0006
Iter: 650 Training Loss: -302.4611
Negative Log Likelihood: 0.0912 Sigma2 Prior: -302.5529 Regularization: 0.0006
Iter: 660 Training Loss: -302.5591
Negative Log Likelihood: 0.0919 Sigma2 Prior: -302.6517 Regularization: 0.0006
Iter: 670 Training Loss: -302.6293
Negative Log Likelihood: 0.1121 Sigma2 Prior: -302.7421 Regularization: 0.0006
Iter: 680 Training Loss: -302.7584
Negative Log Likelihood: 0.1017 Sigma2 Prior: -302.8607 Regularization: 0.0006
Iter: 690 Training Loss: -303.0341
Negative Log Likelihood: 0.0645 Sigma2 Prior: -303.0992 Regularization: 0.0006
Iter: 700 Training Loss: -303.0613
Negative Log Likelihood: 0.0991 Sigma2 Prior: -303.1611 Regularization: 0.0006
Iter: 710 Training Loss: -303.0722
Negative Log Likelihood: 0.1232 Sigma2 Prior: -303.1961 Regularization: 0.0006
Iter: 720 Training Loss: -303.3741
Negative Log Likelihood: 0.0947 Sigma2 Prior: -303.4695 Regularization: 0.0006
Iter: 730 Training Loss: -303.5182
Negative Log Likelihood: 0.1053 Sigma2 Prior: -303.6241 Regularization: 0.0006
Iter: 740 Training Loss: -303.7680
Negative Log Likelihood: 0.0731 Sigma2 Prior: -303.8417 Regularization: 0.0006
Iter: 750 Training Loss: -303.8307
Negative Log Likelihood: 0.0740 Sigma2 Prior: -303.9053 Regularization: 0.0006
Iter: 760 Training Loss: -303.8647
Negative Log Likelihood: 0.1077 Sigma2 Prior: -303.9730 Regularization: 0.0006
Iter: 770 Training Loss: -304.1971
Negative Log Likelihood: 0.0760 Sigma2 Prior: -304.2737 Regularization: 0.0006
Iter: 780 Training Loss: -304.2335
Negative Log Likelihood: 0.0956 Sigma2 Prior: -304.3297 Regularization: 0.0006
Iter: 790 Training Loss: -304.4388
Negative Log Likelihood: 0.0917 Sigma2 Prior: -304.5311 Regularization: 0.0006
Iter: 800 Training Loss: -304.6026
Negative Log Likelihood: 0.0776 Sigma2 Prior: -304.6808 Regularization: 0.0006
Iter: 810 Training Loss: -304.6454
Negative Log Likelihood: 0.1080 Sigma2 Prior: -304.7541 Regularization: 0.0006
Iter: 820 Training Loss: -304.7875
Negative Log Likelihood: 0.0952 Sigma2 Prior: -304.8833 Regularization: 0.0006
Iter: 830 Training Loss: -304.9782
Negative Log Likelihood: 0.0935 Sigma2 Prior: -305.0724 Regularization: 0.0006
Iter: 840 Training Loss: -305.1052
Negative Log Likelihood: 0.1015 Sigma2 Prior: -305.2074 Regularization: 0.0006
Iter: 850 Training Loss: -305.1952
Negative Log Likelihood: 0.1143 Sigma2 Prior: -305.3101 Regularization: 0.0006
Iter: 860 Training Loss: -305.3628
Negative Log Likelihood: 0.0894 Sigma2 Prior: -305.4528 Regularization: 0.0006
Iter: 870 Training Loss: -305.5805
Negative Log Likelihood: 0.0891 Sigma2 Prior: -305.6702 Regularization: 0.0006
Iter: 880 Training Loss: -305.6956
Negative Log Likelihood: 0.0992 Sigma2 Prior: -305.7954 Regularization: 0.0006
Iter: 890 Training Loss: -305.7558
Negative Log Likelihood: 0.0891 Sigma2 Prior: -305.8456 Regularization: 0.0006
Iter: 900 Training Loss: -306.0251
Negative Log Likelihood: 0.0935 Sigma2 Prior: -306.1192 Regularization: 0.0006
Iter: 910 Training Loss: -306.2444
Negative Log Likelihood: 0.0719 Sigma2 Prior: -306.3170 Regularization: 0.0006
Iter: 920 Training Loss: -306.2163
Negative Log Likelihood: 0.1155 Sigma2 Prior: -306.3325 Regularization: 0.0006
Iter: 930 Training Loss: -306.4367
Negative Log Likelihood: 0.0923 Sigma2 Prior: -306.5296 Regularization: 0.0006
Iter: 940 Training Loss: -306.5031
Negative Log Likelihood: 0.1071 Sigma2 Prior: -306.6108 Regularization: 0.0006
Iter: 950 Training Loss: -306.7175
Negative Log Likelihood: 0.0770 Sigma2 Prior: -306.7951 Regularization: 0.0006
Iter: 960 Training Loss: -306.8451
Negative Log Likelihood: 0.0951 Sigma2 Prior: -306.9408 Regularization: 0.0006
Iter: 970 Training Loss: -307.0551
Negative Log Likelihood: 0.0845 Sigma2 Prior: -307.1403 Regularization: 0.0006
Iter: 980 Training Loss: -307.1711
Negative Log Likelihood: 0.0907 Sigma2 Prior: -307.2625 Regularization: 0.0006
Iter: 990 Training Loss: -307.4298
Negative Log Likelihood: 0.0719 Sigma2 Prior: -307.5023 Regularization: 0.0006
Iter: 1000 Training Loss: -307.4031
Negative Log Likelihood: 0.1012 Sigma2 Prior: -307.5049 Regularization: 0.0006
Iter: 1010 Training Loss: -307.5197
Negative Log Likelihood: 0.0959 Sigma2 Prior: -307.6162 Regularization: 0.0006
Iter: 1020 Training Loss: -307.6790
Negative Log Likelihood: 0.0943 Sigma2 Prior: -307.7739 Regularization: 0.0006
Iter: 1030 Training Loss: -307.9036
Negative Log Likelihood: 0.0844 Sigma2 Prior: -307.9886 Regularization: 0.0006
Iter: 1040 Training Loss: -308.0685
Negative Log Likelihood: 0.0747 Sigma2 Prior: -308.1438 Regularization: 0.0006
Iter: 1050 Training Loss: -308.1566
Negative Log Likelihood: 0.0862 Sigma2 Prior: -308.2434 Regularization: 0.0006
Iter: 1060 Training Loss: -308.2401
Negative Log Likelihood: 0.0975 Sigma2 Prior: -308.3383 Regularization: 0.0006
Iter: 1070 Training Loss: -308.5387
Negative Log Likelihood: 0.0815 Sigma2 Prior: -308.6208 Regularization: 0.0006
Iter: 1080 Training Loss: -308.6100
Negative Log Likelihood: 0.0949 Sigma2 Prior: -308.7055 Regularization: 0.0006
Iter: 1090 Training Loss: -308.7465
Negative Log Likelihood: 0.0960 Sigma2 Prior: -308.8431 Regularization: 0.0006
Iter: 1100 Training Loss: -308.9292
Negative Log Likelihood: 0.0917 Sigma2 Prior: -309.0215 Regularization: 0.0006
Iter: 1110 Training Loss: -309.0396
Negative Log Likelihood: 0.0860 Sigma2 Prior: -309.1262 Regularization: 0.0006
Iter: 1120 Training Loss: -309.3109
Negative Log Likelihood: 0.0750 Sigma2 Prior: -309.3864 Regularization: 0.0006
Iter: 1130 Training Loss: -309.4098
Negative Log Likelihood: 0.0917 Sigma2 Prior: -309.5021 Regularization: 0.0006
Iter: 1140 Training Loss: -309.4720
Negative Log Likelihood: 0.0865 Sigma2 Prior: -309.5591 Regularization: 0.0006
Iter: 1150 Training Loss: -309.7521
Negative Log Likelihood: 0.0706 Sigma2 Prior: -309.8234 Regularization: 0.0006
Iter: 1160 Training Loss: -309.7961
Negative Log Likelihood: 0.0928 Sigma2 Prior: -309.8894 Regularization: 0.0006
Iter: 1170 Training Loss: -309.9894
Negative Log Likelihood: 0.0826 Sigma2 Prior: -310.0726 Regularization: 0.0006
Iter: 1180 Training Loss: -310.0692
Negative Log Likelihood: 0.1070 Sigma2 Prior: -310.1768 Regularization: 0.0006
Iter: 1190 Training Loss: -310.3997
Negative Log Likelihood: 0.0755 Sigma2 Prior: -310.4757 Regularization: 0.0006
Iter: 1200 Training Loss: -310.4816
Negative Log Likelihood: 0.0968 Sigma2 Prior: -310.5789 Regularization: 0.0006
Iter: 1210 Training Loss: -310.6189
Negative Log Likelihood: 0.0969 Sigma2 Prior: -310.7164 Regularization: 0.0006
Iter: 1220 Training Loss: -310.6923
Negative Log Likelihood: 0.0893 Sigma2 Prior: -310.7822 Regularization: 0.0006
Iter: 1230 Training Loss: -310.8269
Negative Log Likelihood: 0.0824 Sigma2 Prior: -310.9099 Regularization: 0.0006
Iter: 1240 Training Loss: -311.0138
Negative Log Likelihood: 0.0905 Sigma2 Prior: -311.1049 Regularization: 0.0006
Iter: 1250 Training Loss: -311.1530
Negative Log Likelihood: 0.0927 Sigma2 Prior: -311.2463 Regularization: 0.0006
Iter: 1260 Training Loss: -311.3544
Negative Log Likelihood: 0.0765 Sigma2 Prior: -311.4315 Regularization: 0.0006
Iter: 1270 Training Loss: -311.5592
Negative Log Likelihood: 0.0802 Sigma2 Prior: -311.6401 Regularization: 0.0006
Iter: 1280 Training Loss: -311.6955
Negative Log Likelihood: 0.0762 Sigma2 Prior: -311.7723 Regularization: 0.0006
Iter: 1290 Training Loss: -311.8909
Negative Log Likelihood: 0.0761 Sigma2 Prior: -311.9677 Regularization: 0.0006
Iter: 1300 Training Loss: -312.0223
Negative Log Likelihood: 0.0875 Sigma2 Prior: -312.1105 Regularization: 0.0006
Iter: 1310 Training Loss: -312.2397
Negative Log Likelihood: 0.0740 Sigma2 Prior: -312.3144 Regularization: 0.0006
Iter: 1320 Training Loss: -312.3552
Negative Log Likelihood: 0.0907 Sigma2 Prior: -312.4465 Regularization: 0.0006
Iter: 1330 Training Loss: -312.4939
Negative Log Likelihood: 0.0818 Sigma2 Prior: -312.5763 Regularization: 0.0006
Iter: 1340 Training Loss: -312.6237
Negative Log Likelihood: 0.0728 Sigma2 Prior: -312.6971 Regularization: 0.0006
Iter: 1350 Training Loss: -312.7400
Negative Log Likelihood: 0.0793 Sigma2 Prior: -312.8199 Regularization: 0.0006
Iter: 1360 Training Loss: -312.8621
Negative Log Likelihood: 0.0921 Sigma2 Prior: -312.9547 Regularization: 0.0006
Iter: 1370 Training Loss: -313.1195
Negative Log Likelihood: 0.0864 Sigma2 Prior: -313.2066 Regularization: 0.0006
Iter: 1380 Training Loss: -313.1552
Negative Log Likelihood: 0.0893 Sigma2 Prior: -313.2451 Regularization: 0.0006
Iter: 1390 Training Loss: -313.3605
Negative Log Likelihood: 0.0802 Sigma2 Prior: -313.4413 Regularization: 0.0006
Iter: 1400 Training Loss: -313.4704
Negative Log Likelihood: 0.0871 Sigma2 Prior: -313.5581 Regularization: 0.0006
Iter: 1410 Training Loss: -313.7135
Negative Log Likelihood: 0.0714 Sigma2 Prior: -313.7855 Regularization: 0.0006
Iter: 1420 Training Loss: -313.7669
Negative Log Likelihood: 0.0914 Sigma2 Prior: -313.8589 Regularization: 0.0006
Iter: 1430 Training Loss: -314.0649
Negative Log Likelihood: 0.0783 Sigma2 Prior: -314.1439 Regularization: 0.0006
Iter: 1440 Training Loss: -314.1483
Negative Log Likelihood: 0.0848 Sigma2 Prior: -314.2337 Regularization: 0.0006
Iter: 1450 Training Loss: -314.2434
Negative Log Likelihood: 0.0931 Sigma2 Prior: -314.3372 Regularization: 0.0006
Iter: 1460 Training Loss: -314.4440
Negative Log Likelihood: 0.0908 Sigma2 Prior: -314.5354 Regularization: 0.0006
Iter: 1470 Training Loss: -314.6235
Negative Log Likelihood: 0.0871 Sigma2 Prior: -314.7112 Regularization: 0.0006
Iter: 1480 Training Loss: -314.6680
Negative Log Likelihood: 0.0953 Sigma2 Prior: -314.7639 Regularization: 0.0006
Iter: 1490 Training Loss: -314.9998
Negative Log Likelihood: 0.0742 Sigma2 Prior: -315.0746 Regularization: 0.0006
Iter: 1500 Training Loss: -315.0698
Negative Log Likelihood: 0.0980 Sigma2 Prior: -315.1684 Regularization: 0.0006
Iter: 1510 Training Loss: -315.2988
Negative Log Likelihood: 0.0781 Sigma2 Prior: -315.3775 Regularization: 0.0006
Iter: 1520 Training Loss: -315.3861
Negative Log Likelihood: 0.0859 Sigma2 Prior: -315.4726 Regularization: 0.0006
Iter: 1530 Training Loss: -315.6111
Negative Log Likelihood: 0.0856 Sigma2 Prior: -315.6974 Regularization: 0.0006
Iter: 1540 Training Loss: -315.6920
Negative Log Likelihood: 0.0893 Sigma2 Prior: -315.7820 Regularization: 0.0006
Iter: 1550 Training Loss: -315.8524
Negative Log Likelihood: 0.0790 Sigma2 Prior: -315.9320 Regularization: 0.0006
Iter: 1560 Training Loss: -315.9643
Negative Log Likelihood: 0.0993 Sigma2 Prior: -316.0642 Regularization: 0.0006
Iter: 1570 Training Loss: -316.1303
Negative Log Likelihood: 0.0984 Sigma2 Prior: -316.2293 Regularization: 0.0006
Iter: 1580 Training Loss: -316.3741
Negative Log Likelihood: 0.0810 Sigma2 Prior: -316.4557 Regularization: 0.0006
Iter: 1590 Training Loss: -316.5742
Negative Log Likelihood: 0.0917 Sigma2 Prior: -316.6665 Regularization: 0.0006
Iter: 1600 Training Loss: -316.8289
Negative Log Likelihood: 0.0718 Sigma2 Prior: -316.9013 Regularization: 0.0006
Iter: 1610 Training Loss: -316.9655
Negative Log Likelihood: 0.0729 Sigma2 Prior: -317.0391 Regularization: 0.0006
Iter: 1620 Training Loss: -317.0010
Negative Log Likelihood: 0.0904 Sigma2 Prior: -317.0920 Regularization: 0.0006
Iter: 1630 Training Loss: -317.1661
Negative Log Likelihood: 0.0817 Sigma2 Prior: -317.2484 Regularization: 0.0006
Iter: 1640 Training Loss: -317.2943
Negative Log Likelihood: 0.0877 Sigma2 Prior: -317.3825 Regularization: 0.0006
Iter: 1650 Training Loss: -317.5265
Negative Log Likelihood: 0.0730 Sigma2 Prior: -317.6000 Regularization: 0.0006
Iter: 1660 Training Loss: -317.6992
Negative Log Likelihood: 0.0947 Sigma2 Prior: -317.7945 Regularization: 0.0006
Iter: 1670 Training Loss: -317.8151
Negative Log Likelihood: 0.0806 Sigma2 Prior: -317.8964 Regularization: 0.0006
Iter: 1680 Training Loss: -318.1438
Negative Log Likelihood: 0.0677 Sigma2 Prior: -318.2121 Regularization: 0.0006
Iter: 1690 Training Loss: -318.1888
Negative Log Likelihood: 0.0877 Sigma2 Prior: -318.2771 Regularization: 0.0006
Iter: 1700 Training Loss: -318.3336
Negative Log Likelihood: 0.0845 Sigma2 Prior: -318.4187 Regularization: 0.0006
Iter: 1710 Training Loss: -318.4632
Negative Log Likelihood: 0.0887 Sigma2 Prior: -318.5524 Regularization: 0.0006
Iter: 1720 Training Loss: -318.6331
Negative Log Likelihood: 0.0939 Sigma2 Prior: -318.7276 Regularization: 0.0006
Iter: 1730 Training Loss: -318.8208
Negative Log Likelihood: 0.0768 Sigma2 Prior: -318.8982 Regularization: 0.0006
Iter: 1740 Training Loss: -319.1058
Negative Log Likelihood: 0.0774 Sigma2 Prior: -319.1838 Regularization: 0.0006
Iter: 1750 Training Loss: -319.2065
Negative Log Likelihood: 0.0801 Sigma2 Prior: -319.2872 Regularization: 0.0006
Iter: 1760 Training Loss: -319.2932
Negative Log Likelihood: 0.0890 Sigma2 Prior: -319.3828 Regularization: 0.0006
Iter: 1770 Training Loss: -319.5010
Negative Log Likelihood: 0.0777 Sigma2 Prior: -319.5793 Regularization: 0.0006
Iter: 1780 Training Loss: -319.7315
Negative Log Likelihood: 0.0709 Sigma2 Prior: -319.8031 Regularization: 0.0006
Iter: 1790 Training Loss: -319.8305
Negative Log Likelihood: 0.0756 Sigma2 Prior: -319.9067 Regularization: 0.0006
Iter: 1800 Training Loss: -320.0067
Negative Log Likelihood: 0.0825 Sigma2 Prior: -320.0898 Regularization: 0.0006
Iter: 1810 Training Loss: -320.3524
Negative Log Likelihood: 0.0692 Sigma2 Prior: -320.4222 Regularization: 0.0006
Iter: 1820 Training Loss: -320.4152
Negative Log Likelihood: 0.0821 Sigma2 Prior: -320.4979 Regularization: 0.0006
Iter: 1830 Training Loss: -320.4802
Negative Log Likelihood: 0.0685 Sigma2 Prior: -320.5493 Regularization: 0.0006
Iter: 1840 Training Loss: -320.5881
Negative Log Likelihood: 0.0946 Sigma2 Prior: -320.6833 Regularization: 0.0006
Iter: 1850 Training Loss: -320.7672
Negative Log Likelihood: 0.0966 Sigma2 Prior: -320.8644 Regularization: 0.0006
Iter: 1860 Training Loss: -320.9552
Negative Log Likelihood: 0.0806 Sigma2 Prior: -321.0365 Regularization: 0.0006
Iter: 1870 Training Loss: -321.2862
Negative Log Likelihood: 0.0657 Sigma2 Prior: -321.3525 Regularization: 0.0006
Iter: 1880 Training Loss: -321.3665
Negative Log Likelihood: 0.0894 Sigma2 Prior: -321.4566 Regularization: 0.0006
Iter: 1890 Training Loss: -321.5111
Negative Log Likelihood: 0.0820 Sigma2 Prior: -321.5937 Regularization: 0.0006
Iter: 1900 Training Loss: -321.7009
Negative Log Likelihood: 0.0721 Sigma2 Prior: -321.7736 Regularization: 0.0006
Iter: 1910 Training Loss: -321.9050
Negative Log Likelihood: 0.0645 Sigma2 Prior: -321.9702 Regularization: 0.0006
Iter: 1920 Training Loss: -321.9774
Negative Log Likelihood: 0.0841 Sigma2 Prior: -322.0620 Regularization: 0.0006
Iter: 1930 Training Loss: -322.1822
Negative Log Likelihood: 0.0787 Sigma2 Prior: -322.2614 Regularization: 0.0006
Iter: 1940 Training Loss: -322.4178
Negative Log Likelihood: 0.0788 Sigma2 Prior: -322.4972 Regularization: 0.0006
Iter: 1950 Training Loss: -322.5313
Negative Log Likelihood: 0.0877 Sigma2 Prior: -322.6197 Regularization: 0.0006
Iter: 1960 Training Loss: -322.7161
Negative Log Likelihood: 0.0698 Sigma2 Prior: -322.7865 Regularization: 0.0006
Iter: 1970 Training Loss: -322.8975
Negative Log Likelihood: 0.0782 Sigma2 Prior: -322.9763 Regularization: 0.0006
Iter: 1980 Training Loss: -323.1339
Negative Log Likelihood: 0.0658 Sigma2 Prior: -323.2003 Regularization: 0.0006
Iter: 1990 Training Loss: -323.1126
Negative Log Likelihood: 0.0898 Sigma2 Prior: -323.2030 Regularization: 0.0006
Iter: 2000 Training Loss: -323.4022
Negative Log Likelihood: 0.0901 Sigma2 Prior: -323.4930 Regularization: 0.0006
Iter: 2010 Training Loss: -323.6802
Negative Log Likelihood: 0.0834 Sigma2 Prior: -323.7642 Regularization: 0.0006
Iter: 2020 Training Loss: -323.7456
Negative Log Likelihood: 0.0852 Sigma2 Prior: -323.8315 Regularization: 0.0006
Iter: 2030 Training Loss: -323.9381
Negative Log Likelihood: 0.0843 Sigma2 Prior: -324.0230 Regularization: 0.0006
Iter: 2040 Training Loss: -324.1352
Negative Log Likelihood: 0.0787 Sigma2 Prior: -324.2144 Regularization: 0.0006
Iter: 2050 Training Loss: -324.3423
Negative Log Likelihood: 0.0821 Sigma2 Prior: -324.4250 Regularization: 0.0006
Iter: 2060 Training Loss: -324.5129
Negative Log Likelihood: 0.0617 Sigma2 Prior: -324.5752 Regularization: 0.0006
Iter: 2070 Training Loss: -324.6610
Negative Log Likelihood: 0.0682 Sigma2 Prior: -324.7298 Regularization: 0.0006
Iter: 2080 Training Loss: -324.7672
Negative Log Likelihood: 0.0804 Sigma2 Prior: -324.8482 Regularization: 0.0006
Iter: 2090 Training Loss: -325.0027
Negative Log Likelihood: 0.0710 Sigma2 Prior: -325.0743 Regularization: 0.0006
Iter: 2100 Training Loss: -325.1182
Negative Log Likelihood: 0.0879 Sigma2 Prior: -325.2067 Regularization: 0.0006
Iter: 2110 Training Loss: -325.3903
Negative Log Likelihood: 0.0699 Sigma2 Prior: -325.4608 Regularization: 0.0006
Iter: 2120 Training Loss: -325.5568
Negative Log Likelihood: 0.0709 Sigma2 Prior: -325.6284 Regularization: 0.0006
Iter: 2130 Training Loss: -325.5882
Negative Log Likelihood: 0.0863 Sigma2 Prior: -325.6752 Regularization: 0.0006
Iter: 2140 Training Loss: -325.9692
Negative Log Likelihood: 0.0781 Sigma2 Prior: -326.0479 Regularization: 0.0006
Iter: 2150 Training Loss: -326.1191
Negative Log Likelihood: 0.0660 Sigma2 Prior: -326.1857 Regularization: 0.0006
Iter: 2160 Training Loss: -326.1625
Negative Log Likelihood: 0.0876 Sigma2 Prior: -326.2507 Regularization: 0.0006
Iter: 2170 Training Loss: -326.4430
Negative Log Likelihood: 0.0771 Sigma2 Prior: -326.5207 Regularization: 0.0006
Iter: 2180 Training Loss: -326.6283
Negative Log Likelihood: 0.0662 Sigma2 Prior: -326.6951 Regularization: 0.0006
Iter: 2190 Training Loss: -326.7679
Negative Log Likelihood: 0.0798 Sigma2 Prior: -326.8484 Regularization: 0.0006
Iter: 2200 Training Loss: -326.9379
Negative Log Likelihood: 0.0862 Sigma2 Prior: -327.0247 Regularization: 0.0006
Iter: 2210 Training Loss: -327.2742
Negative Log Likelihood: 0.0752 Sigma2 Prior: -327.3500 Regularization: 0.0006
Iter: 2220 Training Loss: -327.2149
Negative Log Likelihood: 0.0867 Sigma2 Prior: -327.3022 Regularization: 0.0006
Iter: 2230 Training Loss: -327.5516
Negative Log Likelihood: 0.0689 Sigma2 Prior: -327.6212 Regularization: 0.0006
Iter: 2240 Training Loss: -327.6122
Negative Log Likelihood: 0.0956 Sigma2 Prior: -327.7085 Regularization: 0.0006
Iter: 2250 Training Loss: -327.7818
Negative Log Likelihood: 0.0752 Sigma2 Prior: -327.8577 Regularization: 0.0006
Iter: 2260 Training Loss: -327.9840
Negative Log Likelihood: 0.0971 Sigma2 Prior: -328.0817 Regularization: 0.0006
Iter: 2270 Training Loss: -328.2056
Negative Log Likelihood: 0.0767 Sigma2 Prior: -328.2829 Regularization: 0.0006
Iter: 2280 Training Loss: -328.3607
Negative Log Likelihood: 0.0831 Sigma2 Prior: -328.4445 Regularization: 0.0006
Iter: 2290 Training Loss: -328.6476
Negative Log Likelihood: 0.0725 Sigma2 Prior: -328.7207 Regularization: 0.0006
Iter: 2300 Training Loss: -328.6038
Negative Log Likelihood: 0.0941 Sigma2 Prior: -328.6985 Regularization: 0.0006
Iter: 2310 Training Loss: -328.9520
Negative Log Likelihood: 0.0799 Sigma2 Prior: -329.0326 Regularization: 0.0006
Iter: 2320 Training Loss: -329.0786
Negative Log Likelihood: 0.0795 Sigma2 Prior: -329.1587 Regularization: 0.0006
Iter: 2330 Training Loss: -329.2320
Negative Log Likelihood: 0.1045 Sigma2 Prior: -329.3370 Regularization: 0.0006
Iter: 2340 Training Loss: -329.4517
Negative Log Likelihood: 0.0745 Sigma2 Prior: -329.5269 Regularization: 0.0006
Iter: 2350 Training Loss: -329.7194
Negative Log Likelihood: 0.0758 Sigma2 Prior: -329.7958 Regularization: 0.0006
Iter: 2360 Training Loss: -329.8744
Negative Log Likelihood: 0.0809 Sigma2 Prior: -329.9559 Regularization: 0.0006
Iter: 2370 Training Loss: -329.9972
Negative Log Likelihood: 0.0732 Sigma2 Prior: -330.0710 Regularization: 0.0006
Iter: 2380 Training Loss: -330.2245
Negative Log Likelihood: 0.0648 Sigma2 Prior: -330.2899 Regularization: 0.0006
Iter: 2390 Training Loss: -330.4160
Negative Log Likelihood: 0.0775 Sigma2 Prior: -330.4941 Regularization: 0.0006
Iter: 2400 Training Loss: -330.5233
Negative Log Likelihood: 0.0823 Sigma2 Prior: -330.6061 Regularization: 0.0006
Iter: 2410 Training Loss: -330.8413
Negative Log Likelihood: 0.0700 Sigma2 Prior: -330.9119 Regularization: 0.0006
Iter: 2420 Training Loss: -330.9188
Negative Log Likelihood: 0.0762 Sigma2 Prior: -330.9956 Regularization: 0.0006
Iter: 2430 Training Loss: -331.0818
Negative Log Likelihood: 0.0920 Sigma2 Prior: -331.1744 Regularization: 0.0006
Iter: 2440 Training Loss: -331.3758
Negative Log Likelihood: 0.0801 Sigma2 Prior: -331.4566 Regularization: 0.0006
Iter: 2450 Training Loss: -331.5941
Negative Log Likelihood: 0.0691 Sigma2 Prior: -331.6638 Regularization: 0.0006
Iter: 2460 Training Loss: -331.7129
Negative Log Likelihood: 0.0901 Sigma2 Prior: -331.8036 Regularization: 0.0006
Iter: 2470 Training Loss: -331.9226
Negative Log Likelihood: 0.0726 Sigma2 Prior: -331.9958 Regularization: 0.0006
Iter: 2480 Training Loss: -332.0706
Negative Log Likelihood: 0.0861 Sigma2 Prior: -332.1573 Regularization: 0.0006
Iter: 2490 Training Loss: -332.2417
Negative Log Likelihood: 0.0765 Sigma2 Prior: -332.3189 Regularization: 0.0006
Iter: 2500 Training Loss: -332.3550
Negative Log Likelihood: 0.0742 Sigma2 Prior: -332.4298 Regularization: 0.0006
Iter: 2510 Training Loss: -332.7703
Negative Log Likelihood: 0.0749 Sigma2 Prior: -332.8458 Regularization: 0.0006
Iter: 2520 Training Loss: -332.8179
Negative Log Likelihood: 0.0901 Sigma2 Prior: -332.9087 Regularization: 0.0006
Iter: 2530 Training Loss: -333.1214
Negative Log Likelihood: 0.0770 Sigma2 Prior: -333.1990 Regularization: 0.0006
Iter: 2540 Training Loss: -333.2626
Negative Log Likelihood: 0.0656 Sigma2 Prior: -333.3288 Regularization: 0.0006
Iter: 2550 Training Loss: -333.4144
Negative Log Likelihood: 0.0927 Sigma2 Prior: -333.5077 Regularization: 0.0006
Iter: 2560 Training Loss: -333.6428
Negative Log Likelihood: 0.0778 Sigma2 Prior: -333.7212 Regularization: 0.0006
Iter: 2570 Training Loss: -333.7151
Negative Log Likelihood: 0.0625 Sigma2 Prior: -333.7782 Regularization: 0.0006
Iter: 2580 Training Loss: -333.8748
Negative Log Likelihood: 0.0917 Sigma2 Prior: -333.9671 Regularization: 0.0006
Iter: 2590 Training Loss: -334.0677
Negative Log Likelihood: 0.0879 Sigma2 Prior: -334.1562 Regularization: 0.0006
Iter: 2600 Training Loss: -334.2982
Negative Log Likelihood: 0.0801 Sigma2 Prior: -334.3789 Regularization: 0.0006
Iter: 2610 Training Loss: -334.5272
Negative Log Likelihood: 0.0906 Sigma2 Prior: -334.6183 Regularization: 0.0006
Iter: 2620 Training Loss: -334.7893
Negative Log Likelihood: 0.0800 Sigma2 Prior: -334.8699 Regularization: 0.0006
Iter: 2630 Training Loss: -334.9515
Negative Log Likelihood: 0.0718 Sigma2 Prior: -335.0240 Regularization: 0.0006
Iter: 2640 Training Loss: -335.2256
Negative Log Likelihood: 0.0714 Sigma2 Prior: -335.2976 Regularization: 0.0006
Iter: 2650 Training Loss: -335.3131
Negative Log Likelihood: 0.0717 Sigma2 Prior: -335.3854 Regularization: 0.0006
Iter: 2660 Training Loss: -335.5082
Negative Log Likelihood: 0.0739 Sigma2 Prior: -335.5827 Regularization: 0.0006
Iter: 2670 Training Loss: -335.5495
Negative Log Likelihood: 0.0840 Sigma2 Prior: -335.6341 Regularization: 0.0006
Iter: 2680 Training Loss: -335.8272
Negative Log Likelihood: 0.0835 Sigma2 Prior: -335.9113 Regularization: 0.0006
Iter: 2690 Training Loss: -336.0701
Negative Log Likelihood: 0.0790 Sigma2 Prior: -336.1497 Regularization: 0.0006
Iter: 2700 Training Loss: -336.3017
Negative Log Likelihood: 0.0745 Sigma2 Prior: -336.3767 Regularization: 0.0006
Iter: 2710 Training Loss: -336.5522
Negative Log Likelihood: 0.0656 Sigma2 Prior: -336.6184 Regularization: 0.0006
Iter: 2720 Training Loss: -336.6366
Negative Log Likelihood: 0.0880 Sigma2 Prior: -336.7252 Regularization: 0.0006
Iter: 2730 Training Loss: -336.8989
Negative Log Likelihood: 0.0728 Sigma2 Prior: -336.9724 Regularization: 0.0006
Iter: 2740 Training Loss: -337.0591
Negative Log Likelihood: 0.0802 Sigma2 Prior: -337.1398 Regularization: 0.0006
Iter: 2750 Training Loss: -337.1747
Negative Log Likelihood: 0.0863 Sigma2 Prior: -337.2616 Regularization: 0.0006
Iter: 2760 Training Loss: -337.4675
Negative Log Likelihood: 0.0863 Sigma2 Prior: -337.5544 Regularization: 0.0006
Iter: 2770 Training Loss: -337.6685
Negative Log Likelihood: 0.0777 Sigma2 Prior: -337.7467 Regularization: 0.0006
Iter: 2780 Training Loss: -337.8244
Negative Log Likelihood: 0.0954 Sigma2 Prior: -337.9204 Regularization: 0.0006
Iter: 2790 Training Loss: -338.1091
Negative Log Likelihood: 0.0801 Sigma2 Prior: -338.1897 Regularization: 0.0006
Iter: 2800 Training Loss: -338.2029
Negative Log Likelihood: 0.0885 Sigma2 Prior: -338.2920 Regularization: 0.0006
Iter: 2810 Training Loss: -338.5401
Negative Log Likelihood: 0.0701 Sigma2 Prior: -338.6108 Regularization: 0.0006
Iter: 2820 Training Loss: -338.6351
Negative Log Likelihood: 0.0885 Sigma2 Prior: -338.7242 Regularization: 0.0006
Iter: 2830 Training Loss: -338.8961
Negative Log Likelihood: 0.0667 Sigma2 Prior: -338.9634 Regularization: 0.0006
Iter: 2840 Training Loss: -339.1290
Negative Log Likelihood: 0.0764 Sigma2 Prior: -339.2059 Regularization: 0.0006
Iter: 2850 Training Loss: -339.1037
Negative Log Likelihood: 0.0753 Sigma2 Prior: -339.1796 Regularization: 0.0006
Iter: 2860 Training Loss: -339.3630
Negative Log Likelihood: 0.0882 Sigma2 Prior: -339.4518 Regularization: 0.0006
Iter: 2870 Training Loss: -339.6915
Negative Log Likelihood: 0.0728 Sigma2 Prior: -339.7649 Regularization: 0.0006
Iter: 2880 Training Loss: -339.8223
Negative Log Likelihood: 0.0783 Sigma2 Prior: -339.9012 Regularization: 0.0006
Iter: 2890 Training Loss: -340.1697
Negative Log Likelihood: 0.0731 Sigma2 Prior: -340.2433 Regularization: 0.0006
Iter: 2900 Training Loss: -340.2490
Negative Log Likelihood: 0.0722 Sigma2 Prior: -340.3218 Regularization: 0.0006
Iter: 2910 Training Loss: -340.4374
Negative Log Likelihood: 0.0770 Sigma2 Prior: -340.5150 Regularization: 0.0006
Iter: 2920 Training Loss: -340.6424
Negative Log Likelihood: 0.0809 Sigma2 Prior: -340.7239 Regularization: 0.0006
Iter: 2930 Training Loss: -340.8479
Negative Log Likelihood: 0.0788 Sigma2 Prior: -340.9274 Regularization: 0.0006
Iter: 2940 Training Loss: -341.0854
Negative Log Likelihood: 0.0818 Sigma2 Prior: -341.1679 Regularization: 0.0006
Iter: 2950 Training Loss: -341.2875
Negative Log Likelihood: 0.0841 Sigma2 Prior: -341.3722 Regularization: 0.0006
Iter: 2960 Training Loss: -341.4955
Negative Log Likelihood: 0.0735 Sigma2 Prior: -341.5696 Regularization: 0.0006
Iter: 2970 Training Loss: -341.5492
Negative Log Likelihood: 0.0834 Sigma2 Prior: -341.6332 Regularization: 0.0006
Iter: 2980 Training Loss: -341.8739
Negative Log Likelihood: 0.0832 Sigma2 Prior: -341.9577 Regularization: 0.0006
Iter: 2990 Training Loss: -342.0513
Negative Log Likelihood: 0.0910 Sigma2 Prior: -342.1429 Regularization: 0.0006
Iter: 3000 Training Loss: -342.4112
Negative Log Likelihood: 0.0705 Sigma2 Prior: -342.4822 Regularization: 0.0006
Iter: 3010 Training Loss: -342.4589
Negative Log Likelihood: 0.0912 Sigma2 Prior: -342.5507 Regularization: 0.0006
Iter: 3020 Training Loss: -342.6975
Negative Log Likelihood: 0.0910 Sigma2 Prior: -342.7891 Regularization: 0.0006
Iter: 3030 Training Loss: -342.9478
Negative Log Likelihood: 0.0793 Sigma2 Prior: -343.0277 Regularization: 0.0006
Iter: 3040 Training Loss: -343.1219
Negative Log Likelihood: 0.0702 Sigma2 Prior: -343.1927 Regularization: 0.0006
Iter: 3050 Training Loss: -343.4121
Negative Log Likelihood: 0.0857 Sigma2 Prior: -343.4984 Regularization: 0.0006
Iter: 3060 Training Loss: -343.6723
Negative Log Likelihood: 0.0723 Sigma2 Prior: -343.7451 Regularization: 0.0006
Iter: 3070 Training Loss: -343.7034
Negative Log Likelihood: 0.0809 Sigma2 Prior: -343.7849 Regularization: 0.0006
Iter: 3080 Training Loss: -344.0765
Negative Log Likelihood: 0.0720 Sigma2 Prior: -344.1491 Regularization: 0.0006
Iter: 3090 Training Loss: -344.1079
Negative Log Likelihood: 0.0903 Sigma2 Prior: -344.1988 Regularization: 0.0006
Iter: 3100 Training Loss: -344.2455
Negative Log Likelihood: 0.1008 Sigma2 Prior: -344.3469 Regularization: 0.0006
Iter: 3110 Training Loss: -344.5108
Negative Log Likelihood: 0.0887 Sigma2 Prior: -344.6002 Regularization: 0.0006
Iter: 3120 Training Loss: -344.6972
Negative Log Likelihood: 0.0947 Sigma2 Prior: -344.7925 Regularization: 0.0006
Iter: 3130 Training Loss: -344.9085
Negative Log Likelihood: 0.0901 Sigma2 Prior: -344.9992 Regularization: 0.0006
Iter: 3140 Training Loss: -345.1711
Negative Log Likelihood: 0.0918 Sigma2 Prior: -345.2635 Regularization: 0.0006
Iter: 3150 Training Loss: -345.3915
Negative Log Likelihood: 0.0888 Sigma2 Prior: -345.4809 Regularization: 0.0006
Iter: 3160 Training Loss: -345.7029
Negative Log Likelihood: 0.0774 Sigma2 Prior: -345.7809 Regularization: 0.0006
Iter: 3170 Training Loss: -345.9355
Negative Log Likelihood: 0.0710 Sigma2 Prior: -346.0070 Regularization: 0.0006
Iter: 3180 Training Loss: -346.1731
Negative Log Likelihood: 0.0907 Sigma2 Prior: -346.2644 Regularization: 0.0006
Iter: 3190 Training Loss: -346.1935
Negative Log Likelihood: 0.0967 Sigma2 Prior: -346.2908 Regularization: 0.0006
Iter: 3200 Training Loss: -346.3719
Negative Log Likelihood: 0.0928 Sigma2 Prior: -346.4653 Regularization: 0.0006
Iter: 3210 Training Loss: -346.6459
Negative Log Likelihood: 0.0895 Sigma2 Prior: -346.7360 Regularization: 0.0006
Iter: 3220 Training Loss: -346.8405
Negative Log Likelihood: 0.0979 Sigma2 Prior: -346.9389 Regularization: 0.0006
Iter: 3230 Training Loss: -347.1130
Negative Log Likelihood: 0.0755 Sigma2 Prior: -347.1891 Regularization: 0.0006
Iter: 3240 Training Loss: -347.2662
Negative Log Likelihood: 0.0694 Sigma2 Prior: -347.3362 Regularization: 0.0006
Iter: 3250 Training Loss: -347.6201
Negative Log Likelihood: 0.0836 Sigma2 Prior: -347.7043 Regularization: 0.0006
Iter: 3260 Training Loss: -347.7351
Negative Log Likelihood: 0.0927 Sigma2 Prior: -347.8284 Regularization: 0.0006
Iter: 3270 Training Loss: -347.9733
Negative Log Likelihood: 0.0899 Sigma2 Prior: -348.0638 Regularization: 0.0006
Iter: 3280 Training Loss: -348.1015
Negative Log Likelihood: 0.0921 Sigma2 Prior: -348.1942 Regularization: 0.0006
Iter: 3290 Training Loss: -348.5552
Negative Log Likelihood: 0.0796 Sigma2 Prior: -348.6354 Regularization: 0.0006
Iter: 3300 Training Loss: -348.5974
Negative Log Likelihood: 0.0818 Sigma2 Prior: -348.6798 Regularization: 0.0006
Iter: 3310 Training Loss: -348.8565
Negative Log Likelihood: 0.0937 Sigma2 Prior: -348.9508 Regularization: 0.0006
Iter: 3320 Training Loss: -349.0870
Negative Log Likelihood: 0.0645 Sigma2 Prior: -349.1521 Regularization: 0.0006
Iter: 3330 Training Loss: -349.3116
Negative Log Likelihood: 0.0800 Sigma2 Prior: -349.3923 Regularization: 0.0006
Iter: 3340 Training Loss: -349.4112
Negative Log Likelihood: 0.0986 Sigma2 Prior: -349.5104 Regularization: 0.0006
Iter: 3350 Training Loss: -349.6572
Negative Log Likelihood: 0.0921 Sigma2 Prior: -349.7498 Regularization: 0.0006
Iter: 3360 Training Loss: -350.0028
Negative Log Likelihood: 0.0986 Sigma2 Prior: -350.1021 Regularization: 0.0006
Iter: 3370 Training Loss: -350.2596
Negative Log Likelihood: 0.0786 Sigma2 Prior: -350.3387 Regularization: 0.0006
Iter: 3380 Training Loss: -350.3910
Negative Log Likelihood: 0.0763 Sigma2 Prior: -350.4679 Regularization: 0.0006
Iter: 3390 Training Loss: -350.6602
Negative Log Likelihood: 0.0738 Sigma2 Prior: -350.7346 Regularization: 0.0006
Iter: 3400 Training Loss: -350.9378
Negative Log Likelihood: 0.0675 Sigma2 Prior: -351.0058 Regularization: 0.0006
Iter: 3410 Training Loss: -351.2201
Negative Log Likelihood: 0.0811 Sigma2 Prior: -351.3018 Regularization: 0.0006
Iter: 3420 Training Loss: -351.3483
Negative Log Likelihood: 0.0854 Sigma2 Prior: -351.4343 Regularization: 0.0006
Iter: 3430 Training Loss: -351.5004
Negative Log Likelihood: 0.0895 Sigma2 Prior: -351.5904 Regularization: 0.0006
Iter: 3440 Training Loss: -351.8497
Negative Log Likelihood: 0.0938 Sigma2 Prior: -351.9441 Regularization: 0.0006
Iter: 3450 Training Loss: -351.9739
Negative Log Likelihood: 0.0902 Sigma2 Prior: -352.0647 Regularization: 0.0006
Iter: 3460 Training Loss: -352.2858
Negative Log Likelihood: 0.0800 Sigma2 Prior: -352.3664 Regularization: 0.0006
Iter: 3470 Training Loss: -352.4739
Negative Log Likelihood: 0.0971 Sigma2 Prior: -352.5716 Regularization: 0.0006
Iter: 3480 Training Loss: -352.6416
Negative Log Likelihood: 0.0902 Sigma2 Prior: -352.7323 Regularization: 0.0006
Iter: 3490 Training Loss: -352.8481
Negative Log Likelihood: 0.1006 Sigma2 Prior: -352.9493 Regularization: 0.0006
Iter: 3500 Training Loss: -353.2123
Negative Log Likelihood: 0.0856 Sigma2 Prior: -353.2986 Regularization: 0.0006
Iter: 3510 Training Loss: -353.3111
Negative Log Likelihood: 0.0899 Sigma2 Prior: -353.4016 Regularization: 0.0006
Iter: 3520 Training Loss: -353.7867
Negative Log Likelihood: 0.0651 Sigma2 Prior: -353.8524 Regularization: 0.0006
Iter: 3530 Training Loss: -353.7123
Negative Log Likelihood: 0.0787 Sigma2 Prior: -353.7916 Regularization: 0.0006
Iter: 3540 Training Loss: -354.0519
Negative Log Likelihood: 0.0750 Sigma2 Prior: -354.1275 Regularization: 0.0006
Iter: 3550 Training Loss: -354.2885
Negative Log Likelihood: 0.0792 Sigma2 Prior: -354.3683 Regularization: 0.0006
Iter: 3560 Training Loss: -354.4471
Negative Log Likelihood: 0.0836 Sigma2 Prior: -354.5313 Regularization: 0.0006
Iter: 3570 Training Loss: -354.8981
Negative Log Likelihood: 0.0792 Sigma2 Prior: -354.9778 Regularization: 0.0006
Iter: 3580 Training Loss: -354.9969
Negative Log Likelihood: 0.0918 Sigma2 Prior: -355.0893 Regularization: 0.0006
Iter: 3590 Training Loss: -355.3070
Negative Log Likelihood: 0.0767 Sigma2 Prior: -355.3844 Regularization: 0.0006
Iter: 3600 Training Loss: -355.5060
Negative Log Likelihood: 0.0846 Sigma2 Prior: -355.5912 Regularization: 0.0006
Iter: 3610 Training Loss: -355.6527
Negative Log Likelihood: 0.0754 Sigma2 Prior: -355.7286 Regularization: 0.0006
Iter: 3620 Training Loss: -355.9648
Negative Log Likelihood: 0.0739 Sigma2 Prior: -356.0393 Regularization: 0.0006
Iter: 3630 Training Loss: -356.1693
Negative Log Likelihood: 0.0930 Sigma2 Prior: -356.2628 Regularization: 0.0006
Iter: 3640 Training Loss: -356.4785
Negative Log Likelihood: 0.0737 Sigma2 Prior: -356.5527 Regularization: 0.0006
Iter: 3650 Training Loss: -356.5566
Negative Log Likelihood: 0.0916 Sigma2 Prior: -356.6488 Regularization: 0.0006
Iter: 3660 Training Loss: -356.9505
Negative Log Likelihood: 0.0823 Sigma2 Prior: -357.0334 Regularization: 0.0006
Iter: 3670 Training Loss: -357.0718
Negative Log Likelihood: 0.0966 Sigma2 Prior: -357.1689 Regularization: 0.0006
Iter: 3680 Training Loss: -357.4221
Negative Log Likelihood: 0.0886 Sigma2 Prior: -357.5112 Regularization: 0.0006
Iter: 3690 Training Loss: -357.5450
Negative Log Likelihood: 0.0945 Sigma2 Prior: -357.6401 Regularization: 0.0006
Iter: 3700 Training Loss: -357.8350
Negative Log Likelihood: 0.0805 Sigma2 Prior: -357.9160 Regularization: 0.0006
Iter: 3710 Training Loss: -358.0569
Negative Log Likelihood: 0.0835 Sigma2 Prior: -358.1410 Regularization: 0.0006
Iter: 3720 Training Loss: -358.3916
Negative Log Likelihood: 0.0857 Sigma2 Prior: -358.4779 Regularization: 0.0006
Iter: 3730 Training Loss: -358.6005
Negative Log Likelihood: 0.0752 Sigma2 Prior: -358.6762 Regularization: 0.0006
Iter: 3740 Training Loss: -358.9191
Negative Log Likelihood: 0.0636 Sigma2 Prior: -358.9833 Regularization: 0.0006
Iter: 3750 Training Loss: -358.9507
Negative Log Likelihood: 0.1008 Sigma2 Prior: -359.0520 Regularization: 0.0006
Iter: 3760 Training Loss: -359.2870
Negative Log Likelihood: 0.0947 Sigma2 Prior: -359.3823 Regularization: 0.0006
Iter: 3770 Training Loss: -359.4975
Negative Log Likelihood: 0.0966 Sigma2 Prior: -359.5947 Regularization: 0.0006
Iter: 3780 Training Loss: -359.6788
Negative Log Likelihood: 0.0864 Sigma2 Prior: -359.7658 Regularization: 0.0006
Iter: 3790 Training Loss: -359.6997
Negative Log Likelihood: 0.0928 Sigma2 Prior: -359.7931 Regularization: 0.0006
Iter: 3800 Training Loss: -360.1705
Negative Log Likelihood: 0.0910 Sigma2 Prior: -360.2621 Regularization: 0.0006
Iter: 3810 Training Loss: -360.4489
Negative Log Likelihood: 0.0973 Sigma2 Prior: -360.5467 Regularization: 0.0006
Iter: 3820 Training Loss: -360.5631
Negative Log Likelihood: 0.0884 Sigma2 Prior: -360.6521 Regularization: 0.0006
Iter: 3830 Training Loss: -360.9345
Negative Log Likelihood: 0.1065 Sigma2 Prior: -361.0415 Regularization: 0.0006
Iter: 3840 Training Loss: -361.2974
Negative Log Likelihood: 0.0773 Sigma2 Prior: -361.3753 Regularization: 0.0006
Iter: 3850 Training Loss: -361.5534
Negative Log Likelihood: 0.0782 Sigma2 Prior: -361.6322 Regularization: 0.0006
Iter: 3860 Training Loss: -361.6090
Negative Log Likelihood: 0.0929 Sigma2 Prior: -361.7025 Regularization: 0.0006
Iter: 3870 Training Loss: -361.9786
Negative Log Likelihood: 0.0928 Sigma2 Prior: -362.0719 Regularization: 0.0006
Iter: 3880 Training Loss: -362.2394
Negative Log Likelihood: 0.0693 Sigma2 Prior: -362.3093 Regularization: 0.0006
Iter: 3890 Training Loss: -362.3403
Negative Log Likelihood: 0.1023 Sigma2 Prior: -362.4431 Regularization: 0.0006
Iter: 3900 Training Loss: -362.6184
Negative Log Likelihood: 0.1001 Sigma2 Prior: -362.7191 Regularization: 0.0006
Iter: 3910 Training Loss: -362.8920
Negative Log Likelihood: 0.1025 Sigma2 Prior: -362.9951 Regularization: 0.0006
Iter: 3920 Training Loss: -363.3847
Negative Log Likelihood: 0.0800 Sigma2 Prior: -363.4652 Regularization: 0.0006
Iter: 3930 Training Loss: -363.5575
Negative Log Likelihood: 0.0787 Sigma2 Prior: -363.6368 Regularization: 0.0006
Iter: 3940 Training Loss: -363.6991
Negative Log Likelihood: 0.0762 Sigma2 Prior: -363.7758 Regularization: 0.0006
Iter: 3950 Training Loss: -363.8205
Negative Log Likelihood: 0.0952 Sigma2 Prior: -363.9162 Regularization: 0.0006
Iter: 3960 Training Loss: -364.2271
Negative Log Likelihood: 0.0823 Sigma2 Prior: -364.3100 Regularization: 0.0006
Iter: 3970 Training Loss: -364.4539
Negative Log Likelihood: 0.0817 Sigma2 Prior: -364.5362 Regularization: 0.0006
Iter: 3980 Training Loss: -364.7636
Negative Log Likelihood: 0.0860 Sigma2 Prior: -364.8502 Regularization: 0.0006
Iter: 3990 Training Loss: -364.8619
Negative Log Likelihood: 0.0832 Sigma2 Prior: -364.9456 Regularization: 0.0006
Iter: 4000 Training Loss: -365.1061
Negative Log Likelihood: 0.0939 Sigma2 Prior: -365.2006 Regularization: 0.0006
Iter: 4010 Training Loss: -365.4240
Negative Log Likelihood: 0.0776 Sigma2 Prior: -365.5022 Regularization: 0.0006
Iter: 4020 Training Loss: -365.6211
Negative Log Likelihood: 0.0867 Sigma2 Prior: -365.7084 Regularization: 0.0006
Iter: 4030 Training Loss: -365.8545
Negative Log Likelihood: 0.0919 Sigma2 Prior: -365.9469 Regularization: 0.0006
Iter: 4040 Training Loss: -366.1370
Negative Log Likelihood: 0.1014 Sigma2 Prior: -366.2390 Regularization: 0.0006
Iter: 4050 Training Loss: -366.5154
Negative Log Likelihood: 0.0921 Sigma2 Prior: -366.6080 Regularization: 0.0006
Iter: 4060 Training Loss: -366.6531
Negative Log Likelihood: 0.0965 Sigma2 Prior: -366.7502 Regularization: 0.0006
Iter: 4070 Training Loss: -366.9746
Negative Log Likelihood: 0.0959 Sigma2 Prior: -367.0711 Regularization: 0.0006
Iter: 4080 Training Loss: -367.2648
Negative Log Likelihood: 0.1008 Sigma2 Prior: -367.3662 Regularization: 0.0006
Iter: 4090 Training Loss: -367.3874
Negative Log Likelihood: 0.1059 Sigma2 Prior: -367.4939 Regularization: 0.0006
Iter: 4100 Training Loss: -367.7430
Negative Log Likelihood: 0.0817 Sigma2 Prior: -367.8252 Regularization: 0.0006
Iter: 4110 Training Loss: -368.1078
Negative Log Likelihood: 0.0895 Sigma2 Prior: -368.1979 Regularization: 0.0006
Iter: 4120 Training Loss: -368.2145
Negative Log Likelihood: 0.1074 Sigma2 Prior: -368.3225 Regularization: 0.0006
Iter: 4130 Training Loss: -368.5507
Negative Log Likelihood: 0.0971 Sigma2 Prior: -368.6483 Regularization: 0.0006
Iter: 4140 Training Loss: -368.7698
Negative Log Likelihood: 0.0998 Sigma2 Prior: -368.8702 Regularization: 0.0006
Iter: 4150 Training Loss: -369.0149
Negative Log Likelihood: 0.0961 Sigma2 Prior: -369.1115 Regularization: 0.0006
Iter: 4160 Training Loss: -369.3386
Negative Log Likelihood: 0.0944 Sigma2 Prior: -369.4337 Regularization: 0.0006
Iter: 4170 Training Loss: -369.5358
Negative Log Likelihood: 0.1052 Sigma2 Prior: -369.6416 Regularization: 0.0006
Iter: 4180 Training Loss: -369.8253
Negative Log Likelihood: 0.0969 Sigma2 Prior: -369.9228 Regularization: 0.0006
Iter: 4190 Training Loss: -370.1805
Negative Log Likelihood: 0.0888 Sigma2 Prior: -370.2699 Regularization: 0.0006
Iter: 4200 Training Loss: -370.3444
Negative Log Likelihood: 0.0858 Sigma2 Prior: -370.4308 Regularization: 0.0006
Iter: 4210 Training Loss: -370.5901
Negative Log Likelihood: 0.0880 Sigma2 Prior: -370.6787 Regularization: 0.0006
Iter: 4220 Training Loss: -370.9249
Negative Log Likelihood: 0.0813 Sigma2 Prior: -371.0067 Regularization: 0.0006
Iter: 4230 Training Loss: -371.2018
Negative Log Likelihood: 0.1058 Sigma2 Prior: -371.3082 Regularization: 0.0006
Iter: 4240 Training Loss: -371.2895
Negative Log Likelihood: 0.0918 Sigma2 Prior: -371.3819 Regularization: 0.0006
Iter: 4250 Training Loss: -371.7366
Negative Log Likelihood: 0.0941 Sigma2 Prior: -371.8313 Regularization: 0.0006
Iter: 4260 Training Loss: -371.8475
Negative Log Likelihood: 0.0925 Sigma2 Prior: -371.9406 Regularization: 0.0006
Iter: 4270 Training Loss: -372.2200
Negative Log Likelihood: 0.0970 Sigma2 Prior: -372.3175 Regularization: 0.0006
Iter: 4280 Training Loss: -372.4836
Negative Log Likelihood: 0.1070 Sigma2 Prior: -372.5912 Regularization: 0.0006
Iter: 4290 Training Loss: -372.7803
Negative Log Likelihood: 0.0925 Sigma2 Prior: -372.8734 Regularization: 0.0006
Iter: 4300 Training Loss: -373.0276
Negative Log Likelihood: 0.1005 Sigma2 Prior: -373.1287 Regularization: 0.0006
Iter: 4310 Training Loss: -373.3235
Negative Log Likelihood: 0.0949 Sigma2 Prior: -373.4189 Regularization: 0.0006
Iter: 4320 Training Loss: -373.7006
Negative Log Likelihood: 0.1089 Sigma2 Prior: -373.8101 Regularization: 0.0006
Iter: 4330 Training Loss: -373.8661
Negative Log Likelihood: 0.0991 Sigma2 Prior: -373.9658 Regularization: 0.0006
Iter: 4340 Training Loss: -374.1928
Negative Log Likelihood: 0.0956 Sigma2 Prior: -374.2890 Regularization: 0.0006
Iter: 4350 Training Loss: -374.3927
Negative Log Likelihood: 0.0975 Sigma2 Prior: -374.4908 Regularization: 0.0006
Iter: 4360 Training Loss: -374.8104
Negative Log Likelihood: 0.0803 Sigma2 Prior: -374.8913 Regularization: 0.0006
Iter: 4370 Training Loss: -375.0852
Negative Log Likelihood: 0.0950 Sigma2 Prior: -375.1808 Regularization: 0.0006
Iter: 4380 Training Loss: -375.2356
Negative Log Likelihood: 0.1036 Sigma2 Prior: -375.3398 Regularization: 0.0006
Iter: 4390 Training Loss: -375.4062
Negative Log Likelihood: 0.1117 Sigma2 Prior: -375.5186 Regularization: 0.0006
Iter: 4400 Training Loss: -375.7696
Negative Log Likelihood: 0.1150 Sigma2 Prior: -375.8852 Regularization: 0.0006
Iter: 4410 Training Loss: -375.9694
Negative Log Likelihood: 0.1042 Sigma2 Prior: -376.0742 Regularization: 0.0006
Iter: 4420 Training Loss: -376.5497
Negative Log Likelihood: 0.0912 Sigma2 Prior: -376.6415 Regularization: 0.0006
Iter: 4430 Training Loss: -376.7237
Negative Log Likelihood: 0.0907 Sigma2 Prior: -376.8150 Regularization: 0.0006
Iter: 4440 Training Loss: -376.8562
Negative Log Likelihood: 0.1078 Sigma2 Prior: -376.9645 Regularization: 0.0006
Iter: 4450 Training Loss: -377.2358
Negative Log Likelihood: 0.0951 Sigma2 Prior: -377.3315 Regularization: 0.0006
Iter: 4460 Training Loss: -377.4901
Negative Log Likelihood: 0.0887 Sigma2 Prior: -377.5794 Regularization: 0.0006
Iter: 4470 Training Loss: -377.8535
Negative Log Likelihood: 0.1062 Sigma2 Prior: -377.9603 Regularization: 0.0006
Iter: 4480 Training Loss: -378.0938
Negative Log Likelihood: 0.0915 Sigma2 Prior: -378.1859 Regularization: 0.0006
Iter: 4490 Training Loss: -378.2451
Negative Log Likelihood: 0.1108 Sigma2 Prior: -378.3565 Regularization: 0.0006
Iter: 4500 Training Loss: -378.6360
Negative Log Likelihood: 0.0950 Sigma2 Prior: -378.7316 Regularization: 0.0006
Iter: 4510 Training Loss: -378.7133
Negative Log Likelihood: 0.1057 Sigma2 Prior: -378.8196 Regularization: 0.0006
Iter: 4520 Training Loss: -379.2616
Negative Log Likelihood: 0.1036 Sigma2 Prior: -379.3658 Regularization: 0.0006
Iter: 4530 Training Loss: -379.5958
Negative Log Likelihood: 0.0926 Sigma2 Prior: -379.6890 Regularization: 0.0006
Iter: 4540 Training Loss: -379.6963
Negative Log Likelihood: 0.0810 Sigma2 Prior: -379.7779 Regularization: 0.0006
Iter: 4550 Training Loss: -379.9771
Negative Log Likelihood: 0.1162 Sigma2 Prior: -380.0939 Regularization: 0.0006
Iter: 4560 Training Loss: -380.2724
Negative Log Likelihood: 0.1055 Sigma2 Prior: -380.3784 Regularization: 0.0006
Iter: 4570 Training Loss: -380.6499
Negative Log Likelihood: 0.0971 Sigma2 Prior: -380.7476 Regularization: 0.0006
Iter: 4580 Training Loss: -380.9928
Negative Log Likelihood: 0.0990 Sigma2 Prior: -381.0924 Regularization: 0.0006
Iter: 4590 Training Loss: -381.1552
Negative Log Likelihood: 0.1055 Sigma2 Prior: -381.2612 Regularization: 0.0006
Iter: 4600 Training Loss: -381.7587
Negative Log Likelihood: 0.0832 Sigma2 Prior: -381.8424 Regularization: 0.0006
Iter: 4610 Training Loss: -381.9669
Negative Log Likelihood: 0.0983 Sigma2 Prior: -382.0659 Regularization: 0.0006
Iter: 4620 Training Loss: -382.1765
Negative Log Likelihood: 0.1061 Sigma2 Prior: -382.2832 Regularization: 0.0006
Iter: 4630 Training Loss: -382.6235
Negative Log Likelihood: 0.0816 Sigma2 Prior: -382.7057 Regularization: 0.0006
Iter: 4640 Training Loss: -382.6602
Negative Log Likelihood: 0.0789 Sigma2 Prior: -382.7397 Regularization: 0.0006
Iter: 4650 Training Loss: -382.8453
Negative Log Likelihood: 0.1186 Sigma2 Prior: -382.9645 Regularization: 0.0006
Iter: 4660 Training Loss: -383.3419
Negative Log Likelihood: 0.0990 Sigma2 Prior: -383.4416 Regularization: 0.0006
Iter: 4670 Training Loss: -383.5139
Negative Log Likelihood: 0.1160 Sigma2 Prior: -383.6305 Regularization: 0.0006
Iter: 4680 Training Loss: -383.7639
Negative Log Likelihood: 0.1107 Sigma2 Prior: -383.8752 Regularization: 0.0006
Iter: 4690 Training Loss: -384.3827
Negative Log Likelihood: 0.1043 Sigma2 Prior: -384.4876 Regularization: 0.0006
Iter: 4700 Training Loss: -384.4183
Negative Log Likelihood: 0.1162 Sigma2 Prior: -384.5350 Regularization: 0.0006
Iter: 4710 Training Loss: -384.5670
Negative Log Likelihood: 0.1054 Sigma2 Prior: -384.6729 Regularization: 0.0006
Iter: 4720 Training Loss: -384.9727
Negative Log Likelihood: 0.1133 Sigma2 Prior: -385.0866 Regularization: 0.0006
Iter: 4730 Training Loss: -385.4466
Negative Log Likelihood: 0.0959 Sigma2 Prior: -385.5431 Regularization: 0.0006
Iter: 4740 Training Loss: -385.7043
Negative Log Likelihood: 0.1147 Sigma2 Prior: -385.8196 Regularization: 0.0006
Iter: 4750 Training Loss: -386.0374
Negative Log Likelihood: 0.1029 Sigma2 Prior: -386.1409 Regularization: 0.0006
Iter: 4760 Training Loss: -386.3611
Negative Log Likelihood: 0.1073 Sigma2 Prior: -386.4690 Regularization: 0.0006
Iter: 4770 Training Loss: -386.6465
Negative Log Likelihood: 0.1023 Sigma2 Prior: -386.7495 Regularization: 0.0006
Iter: 4780 Training Loss: -386.8588
Negative Log Likelihood: 0.1128 Sigma2 Prior: -386.9721 Regularization: 0.0006
Iter: 4790 Training Loss: -387.1997
Negative Log Likelihood: 0.0975 Sigma2 Prior: -387.2978 Regularization: 0.0006
Iter: 4800 Training Loss: -387.3742
Negative Log Likelihood: 0.1289 Sigma2 Prior: -387.5037 Regularization: 0.0006
Iter: 4810 Training Loss: -387.9327
Negative Log Likelihood: 0.1106 Sigma2 Prior: -388.0438 Regularization: 0.0006
Iter: 4820 Training Loss: -388.2476
Negative Log Likelihood: 0.0944 Sigma2 Prior: -388.3426 Regularization: 0.0006
Iter: 4830 Training Loss: -388.4312
Negative Log Likelihood: 0.1183 Sigma2 Prior: -388.5501 Regularization: 0.0006
Iter: 4840 Training Loss: -388.8065
Negative Log Likelihood: 0.1132 Sigma2 Prior: -388.9203 Regularization: 0.0006
Iter: 4850 Training Loss: -389.2628
Negative Log Likelihood: 0.1092 Sigma2 Prior: -389.3727 Regularization: 0.0006
Iter: 4860 Training Loss: -389.4238
Negative Log Likelihood: 0.1022 Sigma2 Prior: -389.5266 Regularization: 0.0006
Iter: 4870 Training Loss: -389.9342
Negative Log Likelihood: 0.0921 Sigma2 Prior: -390.0269 Regularization: 0.0006
Iter: 4880 Training Loss: -389.9544
Negative Log Likelihood: 0.1334 Sigma2 Prior: -390.0884 Regularization: 0.0006
Iter: 4890 Training Loss: -390.5289
Negative Log Likelihood: 0.1075 Sigma2 Prior: -390.6370 Regularization: 0.0006
Iter: 4900 Training Loss: -390.7629
Negative Log Likelihood: 0.1143 Sigma2 Prior: -390.8777 Regularization: 0.0006
Iter: 4910 Training Loss: -391.0119
Negative Log Likelihood: 0.1044 Sigma2 Prior: -391.1169 Regularization: 0.0006
Iter: 4920 Training Loss: -391.1551
Negative Log Likelihood: 0.1282 Sigma2 Prior: -391.2839 Regularization: 0.0006
Iter: 4930 Training Loss: -391.7835
Negative Log Likelihood: 0.1053 Sigma2 Prior: -391.8894 Regularization: 0.0006
Iter: 4940 Training Loss: -392.0193
Negative Log Likelihood: 0.1108 Sigma2 Prior: -392.1307 Regularization: 0.0006
Iter: 4950 Training Loss: -392.2982
Negative Log Likelihood: 0.0984 Sigma2 Prior: -392.3972 Regularization: 0.0006
Iter: 4960 Training Loss: -392.6210
Negative Log Likelihood: 0.1123 Sigma2 Prior: -392.7339 Regularization: 0.0006
Iter: 4970 Training Loss: -392.9786
Negative Log Likelihood: 0.1122 Sigma2 Prior: -393.0914 Regularization: 0.0006
Iter: 4980 Training Loss: -393.1955
Negative Log Likelihood: 0.1161 Sigma2 Prior: -393.3122 Regularization: 0.0006
Iter: 4990 Training Loss: -393.5749
Negative Log Likelihood: 0.1255 Sigma2 Prior: -393.7010 Regularization: 0.0006
Iter: 5000 Training Loss: -393.9783
Negative Log Likelihood: 0.0997 Sigma2 Prior: -394.0786 Regularization: 0.0006
Iter: 5010 Training Loss: -394.3790
Negative Log Likelihood: 0.1184 Sigma2 Prior: -394.4980 Regularization: 0.0006
Iter: 5020 Training Loss: -394.4105
Negative Log Likelihood: 0.1181 Sigma2 Prior: -394.5291 Regularization: 0.0006
Iter: 5030 Training Loss: -395.0344
Negative Log Likelihood: 0.1167 Sigma2 Prior: -395.1517 Regularization: 0.0006
Iter: 5040 Training Loss: -395.2696
Negative Log Likelihood: 0.1167 Sigma2 Prior: -395.3869 Regularization: 0.0006
Iter: 5050 Training Loss: -395.7037
Negative Log Likelihood: 0.1067 Sigma2 Prior: -395.8110 Regularization: 0.0006
Iter: 5060 Training Loss: -395.9413
Negative Log Likelihood: 0.1050 Sigma2 Prior: -396.0469 Regularization: 0.0006
Iter: 5070 Training Loss: -396.4239
Negative Log Likelihood: 0.1032 Sigma2 Prior: -396.5277 Regularization: 0.0006
Iter: 5080 Training Loss: -396.6469
Negative Log Likelihood: 0.0961 Sigma2 Prior: -396.7436 Regularization: 0.0006
Iter: 5090 Training Loss: -397.1467
Negative Log Likelihood: 0.1082 Sigma2 Prior: -397.2554 Regularization: 0.0006
Iter: 5100 Training Loss: -397.1862
Negative Log Likelihood: 0.1292 Sigma2 Prior: -397.3159 Regularization: 0.0006
Iter: 5110 Training Loss: -397.6077
Negative Log Likelihood: 0.1071 Sigma2 Prior: -397.7154 Regularization: 0.0006
Iter: 5120 Training Loss: -397.9876
Negative Log Likelihood: 0.1205 Sigma2 Prior: -398.1087 Regularization: 0.0006
Iter: 5130 Training Loss: -398.4181
Negative Log Likelihood: 0.1220 Sigma2 Prior: -398.5407 Regularization: 0.0006
Iter: 5140 Training Loss: -398.6250
Negative Log Likelihood: 0.0941 Sigma2 Prior: -398.7197 Regularization: 0.0006
Iter: 5150 Training Loss: -398.8534
Negative Log Likelihood: 0.1289 Sigma2 Prior: -398.9828 Regularization: 0.0006
Iter: 5160 Training Loss: -399.1713
Negative Log Likelihood: 0.1263 Sigma2 Prior: -399.2982 Regularization: 0.0006
Iter: 5170 Training Loss: -399.8891
Negative Log Likelihood: 0.0780 Sigma2 Prior: -399.9677 Regularization: 0.0006
Iter: 5180 Training Loss: -399.9597
Negative Log Likelihood: 0.1179 Sigma2 Prior: -400.0782 Regularization: 0.0006
Iter: 5190 Training Loss: -400.4612
Negative Log Likelihood: 0.1350 Sigma2 Prior: -400.5967 Regularization: 0.0006
Iter: 5200 Training Loss: -400.7443
Negative Log Likelihood: 0.1157 Sigma2 Prior: -400.8606 Regularization: 0.0006
Iter: 5210 Training Loss: -401.3030
Negative Log Likelihood: 0.1164 Sigma2 Prior: -401.4200 Regularization: 0.0006
Iter: 5220 Training Loss: -401.5550
Negative Log Likelihood: 0.0966 Sigma2 Prior: -401.6521 Regularization: 0.0006
Iter: 5230 Training Loss: -401.9971
Negative Log Likelihood: 0.1103 Sigma2 Prior: -402.1080 Regularization: 0.0006
Iter: 5240 Training Loss: -402.1436
Negative Log Likelihood: 0.1121 Sigma2 Prior: -402.2563 Regularization: 0.0006
Iter: 5250 Training Loss: -402.6797
Negative Log Likelihood: 0.0921 Sigma2 Prior: -402.7723 Regularization: 0.0006
Iter: 5260 Training Loss: -403.1944
Negative Log Likelihood: 0.1119 Sigma2 Prior: -403.3068 Regularization: 0.0006
Iter: 5270 Training Loss: -403.2489
Negative Log Likelihood: 0.1303 Sigma2 Prior: -403.3798 Regularization: 0.0006
Iter: 5280 Training Loss: -403.5502
Negative Log Likelihood: 0.1204 Sigma2 Prior: -403.6711 Regularization: 0.0006
Iter: 5290 Training Loss: -403.9757
Negative Log Likelihood: 0.1153 Sigma2 Prior: -404.0916 Regularization: 0.0006
Iter: 5300 Training Loss: -404.5558
Negative Log Likelihood: 0.0917 Sigma2 Prior: -404.6480 Regularization: 0.0006
Iter: 5310 Training Loss: -404.6562
Negative Log Likelihood: 0.1091 Sigma2 Prior: -404.7658 Regularization: 0.0006
Iter: 5320 Training Loss: -404.9410
Negative Log Likelihood: 0.1225 Sigma2 Prior: -405.0641 Regularization: 0.0006
Iter: 5330 Training Loss: -405.5094
Negative Log Likelihood: 0.1054 Sigma2 Prior: -405.6154 Regularization: 0.0006
Iter: 5340 Training Loss: -405.7184
Negative Log Likelihood: 0.1026 Sigma2 Prior: -405.8216 Regularization: 0.0006
Iter: 5350 Training Loss: -405.9475
Negative Log Likelihood: 0.1248 Sigma2 Prior: -406.0729 Regularization: 0.0006
Iter: 5360 Training Loss: -406.5901
Negative Log Likelihood: 0.1238 Sigma2 Prior: -406.7145 Regularization: 0.0006
Iter: 5370 Training Loss: -406.7744
Negative Log Likelihood: 0.1133 Sigma2 Prior: -406.8884 Regularization: 0.0006
Iter: 5380 Training Loss: -407.2816
Negative Log Likelihood: 0.1286 Sigma2 Prior: -407.4108 Regularization: 0.0006
Iter: 5390 Training Loss: -407.5842
Negative Log Likelihood: 0.1327 Sigma2 Prior: -407.7176 Regularization: 0.0006
Iter: 5400 Training Loss: -407.9548
Negative Log Likelihood: 0.1125 Sigma2 Prior: -408.0680 Regularization: 0.0006
Iter: 5410 Training Loss: -408.2942
Negative Log Likelihood: 0.1122 Sigma2 Prior: -408.4070 Regularization: 0.0006
Iter: 5420 Training Loss: -408.7424
Negative Log Likelihood: 0.1146 Sigma2 Prior: -408.8575 Regularization: 0.0006
Iter: 5430 Training Loss: -409.2805
Negative Log Likelihood: 0.1214 Sigma2 Prior: -409.4025 Regularization: 0.0006
Iter: 5440 Training Loss: -409.7651
Negative Log Likelihood: 0.1129 Sigma2 Prior: -409.8787 Regularization: 0.0006
Iter: 5450 Training Loss: -409.9541
Negative Log Likelihood: 0.1111 Sigma2 Prior: -410.0658 Regularization: 0.0006
Iter: 5460 Training Loss: -410.3165
Negative Log Likelihood: 0.1058 Sigma2 Prior: -410.4230 Regularization: 0.0006
Iter: 5470 Training Loss: -410.5042
Negative Log Likelihood: 0.1307 Sigma2 Prior: -410.6355 Regularization: 0.0006
Iter: 5480 Training Loss: -411.2329
Negative Log Likelihood: 0.1204 Sigma2 Prior: -411.3539 Regularization: 0.0006
Iter: 5490 Training Loss: -411.3105
Negative Log Likelihood: 0.1478 Sigma2 Prior: -411.4589 Regularization: 0.0006
Iter: 5500 Training Loss: -411.9483
Negative Log Likelihood: 0.1141 Sigma2 Prior: -412.0630 Regularization: 0.0006
Iter: 5510 Training Loss: -412.2280
Negative Log Likelihood: 0.1262 Sigma2 Prior: -412.3549 Regularization: 0.0006
Iter: 5520 Training Loss: -412.5760
Negative Log Likelihood: 0.1167 Sigma2 Prior: -412.6933 Regularization: 0.0006
Iter: 5530 Training Loss: -413.1524
Negative Log Likelihood: 0.1261 Sigma2 Prior: -413.2791 Regularization: 0.0006
Iter: 5540 Training Loss: -413.7780
Negative Log Likelihood: 0.0990 Sigma2 Prior: -413.8776 Regularization: 0.0006
Iter: 5550 Training Loss: -413.7653
Negative Log Likelihood: 0.1198 Sigma2 Prior: -413.8857 Regularization: 0.0006
Iter: 5560 Training Loss: -413.8706
Negative Log Likelihood: 0.1469 Sigma2 Prior: -414.0182 Regularization: 0.0006
Iter: 5570 Training Loss: -414.8941
Negative Log Likelihood: 0.1011 Sigma2 Prior: -414.9958 Regularization: 0.0006
Iter: 5580 Training Loss: -414.8274
Negative Log Likelihood: 0.1322 Sigma2 Prior: -414.9602 Regularization: 0.0006
Iter: 5590 Training Loss: -415.4633
Negative Log Likelihood: 0.1205 Sigma2 Prior: -415.5844 Regularization: 0.0006
Iter: 5600 Training Loss: -415.6257
Negative Log Likelihood: 0.1267 Sigma2 Prior: -415.7530 Regularization: 0.0006
Iter: 5610 Training Loss: -416.0840
Negative Log Likelihood: 0.1242 Sigma2 Prior: -416.2088 Regularization: 0.0006
Iter: 5620 Training Loss: -416.5416
Negative Log Likelihood: 0.1288 Sigma2 Prior: -416.6710 Regularization: 0.0006
Iter: 5630 Training Loss: -417.0602
Negative Log Likelihood: 0.1061 Sigma2 Prior: -417.1669 Regularization: 0.0006
Iter: 5640 Training Loss: -417.4150
Negative Log Likelihood: 0.1187 Sigma2 Prior: -417.5343 Regularization: 0.0006
Iter: 5650 Training Loss: -417.8894
Negative Log Likelihood: 0.1222 Sigma2 Prior: -418.0123 Regularization: 0.0006
Iter: 5660 Training Loss: -418.1009
Negative Log Likelihood: 0.1400 Sigma2 Prior: -418.2415 Regularization: 0.0006
Iter: 5670 Training Loss: -418.5890
Negative Log Likelihood: 0.1355 Sigma2 Prior: -418.7251 Regularization: 0.0006
Iter: 5680 Training Loss: -418.8438
Negative Log Likelihood: 0.1258 Sigma2 Prior: -418.9703 Regularization: 0.0006
Iter: 5690 Training Loss: -419.5866
Negative Log Likelihood: 0.1334 Sigma2 Prior: -419.7206 Regularization: 0.0006
Iter: 5700 Training Loss: -419.9987
Negative Log Likelihood: 0.1190 Sigma2 Prior: -420.1184 Regularization: 0.0006
Iter: 5710 Training Loss: -420.2760
Negative Log Likelihood: 0.1288 Sigma2 Prior: -420.4055 Regularization: 0.0006
Iter: 5720 Training Loss: -420.6576
Negative Log Likelihood: 0.1456 Sigma2 Prior: -420.8038 Regularization: 0.0006
Iter: 5730 Training Loss: -421.0495
Negative Log Likelihood: 0.1269 Sigma2 Prior: -421.1770 Regularization: 0.0006
Iter: 5740 Training Loss: -421.6408
Negative Log Likelihood: 0.1044 Sigma2 Prior: -421.7458 Regularization: 0.0006
Iter: 5750 Training Loss: -421.7416
Negative Log Likelihood: 0.1472 Sigma2 Prior: -421.8894 Regularization: 0.0006
Iter: 5760 Training Loss: -422.6663
Negative Log Likelihood: 0.1319 Sigma2 Prior: -422.7989 Regularization: 0.0006
Iter: 5770 Training Loss: -422.9701
Negative Log Likelihood: 0.1048 Sigma2 Prior: -423.0756 Regularization: 0.0006
Iter: 5780 Training Loss: -423.2651
Negative Log Likelihood: 0.1427 Sigma2 Prior: -423.4085 Regularization: 0.0006
Iter: 5790 Training Loss: -423.3942
Negative Log Likelihood: 0.1383 Sigma2 Prior: -423.5331 Regularization: 0.0006
Iter: 5800 Training Loss: -424.1073
Negative Log Likelihood: 0.1367 Sigma2 Prior: -424.2446 Regularization: 0.0006
Iter: 5810 Training Loss: -424.7145
Negative Log Likelihood: 0.1126 Sigma2 Prior: -424.8277 Regularization: 0.0006
Iter: 5820 Training Loss: -424.7959
Negative Log Likelihood: 0.1555 Sigma2 Prior: -424.9520 Regularization: 0.0006
Iter: 5830 Training Loss: -425.3795
Negative Log Likelihood: 0.1439 Sigma2 Prior: -425.5240 Regularization: 0.0006
Iter: 5840 Training Loss: -426.0586
Negative Log Likelihood: 0.1261 Sigma2 Prior: -426.1854 Regularization: 0.0006
Iter: 5850 Training Loss: -426.5126
Negative Log Likelihood: 0.1557 Sigma2 Prior: -426.6689 Regularization: 0.0006
Iter: 5860 Training Loss: -426.7415
Negative Log Likelihood: 0.1509 Sigma2 Prior: -426.8930 Regularization: 0.0006
Iter: 5870 Training Loss: -427.3567
Negative Log Likelihood: 0.1282 Sigma2 Prior: -427.4855 Regularization: 0.0006
Iter: 5880 Training Loss: -427.5918
Negative Log Likelihood: 0.1408 Sigma2 Prior: -427.7332 Regularization: 0.0006
Iter: 5890 Training Loss: -427.9727
Negative Log Likelihood: 0.1592 Sigma2 Prior: -428.1326 Regularization: 0.0006
Iter: 5900 Training Loss: -428.6067
Negative Log Likelihood: 0.1373 Sigma2 Prior: -428.7446 Regularization: 0.0006
Iter: 5910 Training Loss: -429.3976
Negative Log Likelihood: 0.1072 Sigma2 Prior: -429.5055 Regularization: 0.0006
Iter: 5920 Training Loss: -429.4613
Negative Log Likelihood: 0.1573 Sigma2 Prior: -429.6192 Regularization: 0.0006
Iter: 5930 Training Loss: -429.8458
Negative Log Likelihood: 0.1275 Sigma2 Prior: -429.9739 Regularization: 0.0006
Iter: 5940 Training Loss: -430.1740
Negative Log Likelihood: 0.1768 Sigma2 Prior: -430.3513 Regularization: 0.0006
Iter: 5950 Training Loss: -430.9435
Negative Log Likelihood: 0.1345 Sigma2 Prior: -431.0786 Regularization: 0.0006
Iter: 5960 Training Loss: -431.4669
Negative Log Likelihood: 0.1358 Sigma2 Prior: -431.6034 Regularization: 0.0006
Iter: 5970 Training Loss: -431.8300
Negative Log Likelihood: 0.1268 Sigma2 Prior: -431.9575 Regularization: 0.0006
Iter: 5980 Training Loss: -431.9957
Negative Log Likelihood: 0.1432 Sigma2 Prior: -432.1395 Regularization: 0.0006
Iter: 5990 Training Loss: -432.8139
Negative Log Likelihood: 0.1381 Sigma2 Prior: -432.9526 Regularization: 0.0006
Iter: 6000 Training Loss: -433.3112
Negative Log Likelihood: 0.1596 Sigma2 Prior: -433.4714 Regularization: 0.0006
Iter: 6010 Training Loss: -433.5983
Negative Log Likelihood: 0.1529 Sigma2 Prior: -433.7518 Regularization: 0.0006
Iter: 6020 Training Loss: -434.2339
Negative Log Likelihood: 0.1507 Sigma2 Prior: -434.3852 Regularization: 0.0006
Iter: 6030 Training Loss: -434.8337
Negative Log Likelihood: 0.1233 Sigma2 Prior: -434.9576 Regularization: 0.0006
Iter: 6040 Training Loss: -435.0690
Negative Log Likelihood: 0.1598 Sigma2 Prior: -435.2294 Regularization: 0.0006
Iter: 6050 Training Loss: -435.4714
Negative Log Likelihood: 0.1510 Sigma2 Prior: -435.6230 Regularization: 0.0006
Iter: 6060 Training Loss: -436.1751
Negative Log Likelihood: 0.1475 Sigma2 Prior: -436.3232 Regularization: 0.0006
Iter: 6070 Training Loss: -436.8811
Negative Log Likelihood: 0.1232 Sigma2 Prior: -437.0050 Regularization: 0.0006
Iter: 6080 Training Loss: -437.1113
Negative Log Likelihood: 0.1473 Sigma2 Prior: -437.2592 Regularization: 0.0006
Iter: 6090 Training Loss: -437.5331
Negative Log Likelihood: 0.1211 Sigma2 Prior: -437.6548 Regularization: 0.0006
Iter: 6100 Training Loss: -437.9757
Negative Log Likelihood: 0.1584 Sigma2 Prior: -438.1348 Regularization: 0.0006
Iter: 6110 Training Loss: -438.4420
Negative Log Likelihood: 0.1716 Sigma2 Prior: -438.6142 Regularization: 0.0006
Iter: 6120 Training Loss: -438.4365
Negative Log Likelihood: 0.1833 Sigma2 Prior: -438.6204 Regularization: 0.0006
Iter: 6130 Training Loss: -439.4770
Negative Log Likelihood: 0.1540 Sigma2 Prior: -439.6316 Regularization: 0.0006
Iter: 6140 Training Loss: -439.8889
Negative Log Likelihood: 0.1384 Sigma2 Prior: -440.0279 Regularization: 0.0006
Iter: 6150 Training Loss: -440.4006
Negative Log Likelihood: 0.1242 Sigma2 Prior: -440.5254 Regularization: 0.0006
Iter: 6160 Training Loss: -441.0136
Negative Log Likelihood: 0.1475 Sigma2 Prior: -441.1617 Regularization: 0.0006
Iter: 6170 Training Loss: -441.3311
Negative Log Likelihood: 0.1687 Sigma2 Prior: -441.5004 Regularization: 0.0006
Iter: 6180 Training Loss: -442.2119
Negative Log Likelihood: 0.1482 Sigma2 Prior: -442.3607 Regularization: 0.0006
Iter: 6190 Training Loss: -442.2280
Negative Log Likelihood: 0.1721 Sigma2 Prior: -442.4007 Regularization: 0.0006
Iter: 6200 Training Loss: -442.9086
Negative Log Likelihood: 0.1766 Sigma2 Prior: -443.0858 Regularization: 0.0006
Iter: 6210 Training Loss: -443.6129
Negative Log Likelihood: 0.1715 Sigma2 Prior: -443.7850 Regularization: 0.0006
Iter: 6220 Training Loss: -444.4254
Negative Log Likelihood: 0.1286 Sigma2 Prior: -444.5547 Regularization: 0.0006
Iter: 6230 Training Loss: -444.3418
Negative Log Likelihood: 0.1571 Sigma2 Prior: -444.4995 Regularization: 0.0006
Iter: 6240 Training Loss: -445.1131
Negative Log Likelihood: 0.1523 Sigma2 Prior: -445.2660 Regularization: 0.0006
Iter: 6250 Training Loss: -445.5281
Negative Log Likelihood: 0.1875 Sigma2 Prior: -445.7162 Regularization: 0.0006
Iter: 6260 Training Loss: -445.6612
Negative Log Likelihood: 0.1779 Sigma2 Prior: -445.8397 Regularization: 0.0006
Iter: 6270 Training Loss: -446.5279
Negative Log Likelihood: 0.1638 Sigma2 Prior: -446.6923 Regularization: 0.0006
Iter: 6280 Training Loss: -447.0948
Negative Log Likelihood: 0.1609 Sigma2 Prior: -447.2564 Regularization: 0.0006
Iter: 6290 Training Loss: -447.8056
Negative Log Likelihood: 0.1785 Sigma2 Prior: -447.9847 Regularization: 0.0006
Iter: 6300 Training Loss: -448.3734
Negative Log Likelihood: 0.1579 Sigma2 Prior: -448.5319 Regularization: 0.0006
Iter: 6310 Training Loss: -448.7895
Negative Log Likelihood: 0.1693 Sigma2 Prior: -448.9593 Regularization: 0.0006
Iter: 6320 Training Loss: -449.1887
Negative Log Likelihood: 0.1712 Sigma2 Prior: -449.3605 Regularization: 0.0006
Iter: 6330 Training Loss: -449.9241
Negative Log Likelihood: 0.1700 Sigma2 Prior: -450.0947 Regularization: 0.0006
Iter: 6340 Training Loss: -450.3195
Negative Log Likelihood: 0.1767 Sigma2 Prior: -450.4968 Regularization: 0.0006
Iter: 6350 Training Loss: -450.9570
Negative Log Likelihood: 0.1823 Sigma2 Prior: -451.1399 Regularization: 0.0006
Iter: 6360 Training Loss: -451.3958
Negative Log Likelihood: 0.1880 Sigma2 Prior: -451.5844 Regularization: 0.0006
Iter: 6370 Training Loss: -451.8051
Negative Log Likelihood: 0.1899 Sigma2 Prior: -451.9956 Regularization: 0.0006
Iter: 6380 Training Loss: -452.6682
Negative Log Likelihood: 0.1565 Sigma2 Prior: -452.8253 Regularization: 0.0006
Iter: 6390 Training Loss: -452.9471
Negative Log Likelihood: 0.1756 Sigma2 Prior: -453.1233 Regularization: 0.0006
Iter: 6400 Training Loss: -454.2121
Negative Log Likelihood: 0.1634 Sigma2 Prior: -454.3760 Regularization: 0.0006
Iter: 6410 Training Loss: -454.5439
Negative Log Likelihood: 0.1211 Sigma2 Prior: -454.6656 Regularization: 0.0006
Iter: 6420 Training Loss: -455.2838
Negative Log Likelihood: 0.1333 Sigma2 Prior: -455.4178 Regularization: 0.0006
Iter: 6430 Training Loss: -455.6952
Negative Log Likelihood: 0.1757 Sigma2 Prior: -455.8715 Regularization: 0.0006
Iter: 6440 Training Loss: -456.3568
Negative Log Likelihood: 0.1729 Sigma2 Prior: -456.5303 Regularization: 0.0006
Iter: 6450 Training Loss: -456.6137
Negative Log Likelihood: 0.1657 Sigma2 Prior: -456.7801 Regularization: 0.0006
Iter: 6460 Training Loss: -457.1111
Negative Log Likelihood: 0.1961 Sigma2 Prior: -457.3079 Regularization: 0.0006
Iter: 6470 Training Loss: -457.6569
Negative Log Likelihood: 0.1555 Sigma2 Prior: -457.8129 Regularization: 0.0006
Iter: 6480 Training Loss: -458.2468
Negative Log Likelihood: 0.1867 Sigma2 Prior: -458.4341 Regularization: 0.0006
Iter: 6490 Training Loss: -459.0694
Negative Log Likelihood: 0.1855 Sigma2 Prior: -459.2556 Regularization: 0.0006
Iter: 6500 Training Loss: -459.3843
Negative Log Likelihood: 0.1897 Sigma2 Prior: -459.5746 Regularization: 0.0006
Iter: 6510 Training Loss: -460.7740
Negative Log Likelihood: 0.1634 Sigma2 Prior: -460.9380 Regularization: 0.0006
Iter: 6520 Training Loss: -460.7451
Negative Log Likelihood: 0.1650 Sigma2 Prior: -460.9107 Regularization: 0.0006
Iter: 6530 Training Loss: -461.8070
Negative Log Likelihood: 0.1613 Sigma2 Prior: -461.9689 Regularization: 0.0006
Iter: 6540 Training Loss: -462.2428
Negative Log Likelihood: 0.1823 Sigma2 Prior: -462.4257 Regularization: 0.0006
Iter: 6550 Training Loss: -462.7560
Negative Log Likelihood: 0.1650 Sigma2 Prior: -462.9215 Regularization: 0.0006
Iter: 6560 Training Loss: -463.3140
Negative Log Likelihood: 0.2276 Sigma2 Prior: -463.5422 Regularization: 0.0006
Iter: 6570 Training Loss: -463.7608
Negative Log Likelihood: 0.2054 Sigma2 Prior: -463.9668 Regularization: 0.0006
Iter: 6580 Training Loss: -464.7733
Negative Log Likelihood: 0.1615 Sigma2 Prior: -464.9354 Regularization: 0.0006
Iter: 6590 Training Loss: -465.1430
Negative Log Likelihood: 0.1783 Sigma2 Prior: -465.3219 Regularization: 0.0006
Iter: 6600 Training Loss: -465.5764
Negative Log Likelihood: 0.2003 Sigma2 Prior: -465.7773 Regularization: 0.0006
Iter: 6610 Training Loss: -466.4774
Negative Log Likelihood: 0.1797 Sigma2 Prior: -466.6577 Regularization: 0.0006
Iter: 6620 Training Loss: -467.1163
Negative Log Likelihood: 0.1942 Sigma2 Prior: -467.3112 Regularization: 0.0006
Iter: 6630 Training Loss: -468.0131
Negative Log Likelihood: 0.1978 Sigma2 Prior: -468.2115 Regularization: 0.0006
Iter: 6640 Training Loss: -469.4826
Negative Log Likelihood: 0.1445 Sigma2 Prior: -469.6277 Regularization: 0.0006
Iter: 6650 Training Loss: -469.5394
Negative Log Likelihood: 0.1754 Sigma2 Prior: -469.7154 Regularization: 0.0006
Iter: 6660 Training Loss: -469.7236
Negative Log Likelihood: 0.1998 Sigma2 Prior: -469.9241 Regularization: 0.0006
Iter: 6670 Training Loss: -471.1434
Negative Log Likelihood: 0.1443 Sigma2 Prior: -471.2883 Regularization: 0.0006
Iter: 6680 Training Loss: -471.5732
Negative Log Likelihood: 0.1766 Sigma2 Prior: -471.7504 Regularization: 0.0006
Iter: 6690 Training Loss: -471.7253
Negative Log Likelihood: 0.2317 Sigma2 Prior: -471.9576 Regularization: 0.0006
Iter: 6700 Training Loss: -472.5644
Negative Log Likelihood: 0.1752 Sigma2 Prior: -472.7403 Regularization: 0.0006
Iter: 6710 Training Loss: -473.1859
Negative Log Likelihood: 0.2041 Sigma2 Prior: -473.3907 Regularization: 0.0006
Iter: 6720 Training Loss: -473.9294
Negative Log Likelihood: 0.1998 Sigma2 Prior: -474.1298 Regularization: 0.0006
Iter: 6730 Training Loss: -474.7379
Negative Log Likelihood: 0.1893 Sigma2 Prior: -474.9278 Regularization: 0.0006
Iter: 6740 Training Loss: -474.7798
Negative Log Likelihood: 0.2569 Sigma2 Prior: -475.0374 Regularization: 0.0006
Iter: 6750 Training Loss: -476.0565
Negative Log Likelihood: 0.1906 Sigma2 Prior: -476.2477 Regularization: 0.0006
Iter: 6760 Training Loss: -476.5151
Negative Log Likelihood: 0.2066 Sigma2 Prior: -476.7223 Regularization: 0.0006
Iter: 6770 Training Loss: -477.2174
Negative Log Likelihood: 0.2468 Sigma2 Prior: -477.4649 Regularization: 0.0006
Iter: 6780 Training Loss: -477.5540
Negative Log Likelihood: 0.2386 Sigma2 Prior: -477.7932 Regularization: 0.0006
Iter: 6790 Training Loss: -478.4054
Negative Log Likelihood: 0.2150 Sigma2 Prior: -478.6210 Regularization: 0.0006
Iter: 6800 Training Loss: -479.2807
Negative Log Likelihood: 0.2444 Sigma2 Prior: -479.5257 Regularization: 0.0006
Iter: 6810 Training Loss: -480.4616
Negative Log Likelihood: 0.2010 Sigma2 Prior: -480.6631 Regularization: 0.0006
Iter: 6820 Training Loss: -480.9392
Negative Log Likelihood: 0.1874 Sigma2 Prior: -481.1272 Regularization: 0.0006
Iter: 6830 Training Loss: -481.3282
Negative Log Likelihood: 0.2231 Sigma2 Prior: -481.5519 Regularization: 0.0006
Iter: 6840 Training Loss: -482.7109
Negative Log Likelihood: 0.2148 Sigma2 Prior: -482.9263 Regularization: 0.0006
Iter: 6850 Training Loss: -483.6042
Negative Log Likelihood: 0.1806 Sigma2 Prior: -483.7854 Regularization: 0.0006
Iter: 6860 Training Loss: -484.2071
Negative Log Likelihood: 0.2122 Sigma2 Prior: -484.4200 Regularization: 0.0006
Iter: 6870 Training Loss: -484.8364
Negative Log Likelihood: 0.2193 Sigma2 Prior: -485.0562 Regularization: 0.0006
Iter: 6880 Training Loss: -485.5329
Negative Log Likelihood: 0.2024 Sigma2 Prior: -485.7360 Regularization: 0.0006
Iter: 6890 Training Loss: -486.7293
Negative Log Likelihood: 0.2074 Sigma2 Prior: -486.9373 Regularization: 0.0006
Iter: 6900 Training Loss: -487.2579
Negative Log Likelihood: 0.2154 Sigma2 Prior: -487.4739 Regularization: 0.0006
Iter: 6910 Training Loss: -488.0033
Negative Log Likelihood: 0.2049 Sigma2 Prior: -488.2088 Regularization: 0.0006
Iter: 6920 Training Loss: -488.7433
Negative Log Likelihood: 0.2081 Sigma2 Prior: -488.9520 Regularization: 0.0006
Iter: 6930 Training Loss: -489.3177
Negative Log Likelihood: 0.2109 Sigma2 Prior: -489.5292 Regularization: 0.0006
Iter: 6940 Training Loss: -490.4705
Negative Log Likelihood: 0.1914 Sigma2 Prior: -490.6625 Regularization: 0.0006
Iter: 6950 Training Loss: -491.5803
Negative Log Likelihood: 0.1923 Sigma2 Prior: -491.7732 Regularization: 0.0006
Iter: 6960 Training Loss: -492.2954
Negative Log Likelihood: 0.2595 Sigma2 Prior: -492.5555 Regularization: 0.0006
Iter: 6970 Training Loss: -492.4628
Negative Log Likelihood: 0.2400 Sigma2 Prior: -492.7034 Regularization: 0.0006
Iter: 6980 Training Loss: -493.4791
Negative Log Likelihood: 0.2187 Sigma2 Prior: -493.6984 Regularization: 0.0006
Iter: 6990 Training Loss: -493.9047
Negative Log Likelihood: 0.2825 Sigma2 Prior: -494.1878 Regularization: 0.0006
Iter: 7000 Training Loss: -495.1380
Negative Log Likelihood: 0.2072 Sigma2 Prior: -495.3459 Regularization: 0.0006
Iter: 7010 Training Loss: -496.4376
Negative Log Likelihood: 0.2020 Sigma2 Prior: -496.6402 Regularization: 0.0006
Iter: 7020 Training Loss: -497.0912
Negative Log Likelihood: 0.2730 Sigma2 Prior: -497.3648 Regularization: 0.0006
Iter: 7030 Training Loss: -498.0609
Negative Log Likelihood: 0.2145 Sigma2 Prior: -498.2761 Regularization: 0.0006
Iter: 7040 Training Loss: -498.3711
Negative Log Likelihood: 0.2466 Sigma2 Prior: -498.6183 Regularization: 0.0006
Iter: 7050 Training Loss: -499.2642
Negative Log Likelihood: 0.2880 Sigma2 Prior: -499.5529 Regularization: 0.0006
Iter: 7060 Training Loss: -500.4376
Negative Log Likelihood: 0.2481 Sigma2 Prior: -500.6863 Regularization: 0.0006
Iter: 7070 Training Loss: -501.8972
Negative Log Likelihood: 0.2266 Sigma2 Prior: -502.1245 Regularization: 0.0006
Iter: 7080 Training Loss: -502.4260
Negative Log Likelihood: 0.2266 Sigma2 Prior: -502.6532 Regularization: 0.0006
Iter: 7090 Training Loss: -502.9935
Negative Log Likelihood: 0.2749 Sigma2 Prior: -503.2690 Regularization: 0.0006
Iter: 7100 Training Loss: -504.1201
Negative Log Likelihood: 0.2445 Sigma2 Prior: -504.3651 Regularization: 0.0006
Iter: 7110 Training Loss: -505.6293
Negative Log Likelihood: 0.2477 Sigma2 Prior: -505.8776 Regularization: 0.0006
Iter: 7120 Training Loss: -505.6608
Negative Log Likelihood: 0.2652 Sigma2 Prior: -505.9266 Regularization: 0.0006
Iter: 7130 Training Loss: -507.2155
Negative Log Likelihood: 0.2394 Sigma2 Prior: -507.4556 Regularization: 0.0006
Iter: 7140 Training Loss: -507.9835
Negative Log Likelihood: 0.2509 Sigma2 Prior: -508.2350 Regularization: 0.0006
Iter: 7150 Training Loss: -509.5467
Negative Log Likelihood: 0.2051 Sigma2 Prior: -509.7523 Regularization: 0.0006
Iter: 7160 Training Loss: -510.1082
Negative Log Likelihood: 0.2749 Sigma2 Prior: -510.3837 Regularization: 0.0006
Iter: 7170 Training Loss: -510.3143
Negative Log Likelihood: 0.3329 Sigma2 Prior: -510.6478 Regularization: 0.0006
Iter: 7180 Training Loss: -512.2188
Negative Log Likelihood: 0.2763 Sigma2 Prior: -512.4957 Regularization: 0.0006
Iter: 7190 Training Loss: -513.7212
Negative Log Likelihood: 0.2682 Sigma2 Prior: -513.9900 Regularization: 0.0006
Iter: 7200 Training Loss: -513.9326
Negative Log Likelihood: 0.2874 Sigma2 Prior: -514.2206 Regularization: 0.0006
Iter: 7210 Training Loss: -515.0933
Negative Log Likelihood: 0.2527 Sigma2 Prior: -515.3466 Regularization: 0.0006
Iter: 7220 Training Loss: -515.9373
Negative Log Likelihood: 0.2658 Sigma2 Prior: -516.2036 Regularization: 0.0006
Iter: 7230 Training Loss: -516.4247
Negative Log Likelihood: 0.3491 Sigma2 Prior: -516.7745 Regularization: 0.0006
Iter: 7240 Training Loss: -518.0515
Negative Log Likelihood: 0.2652 Sigma2 Prior: -518.3172 Regularization: 0.0006
Iter: 7250 Training Loss: -519.4200
Negative Log Likelihood: 0.2312 Sigma2 Prior: -519.6519 Regularization: 0.0006
Iter: 7260 Training Loss: -519.5917
Negative Log Likelihood: 0.3460 Sigma2 Prior: -519.9384 Regularization: 0.0006
Iter: 7270 Training Loss: -521.1174
Negative Log Likelihood: 0.2672 Sigma2 Prior: -521.3853 Regularization: 0.0006
Iter: 7280 Training Loss: -521.9561
Negative Log Likelihood: 0.2976 Sigma2 Prior: -522.2543 Regularization: 0.0006
Iter: 7290 Training Loss: -523.5284
Negative Log Likelihood: 0.2849 Sigma2 Prior: -523.8139 Regularization: 0.0006
Iter: 7300 Training Loss: -524.1696
Negative Log Likelihood: 0.3357 Sigma2 Prior: -524.5059 Regularization: 0.0006
Iter: 7310 Training Loss: -524.8035
Negative Log Likelihood: 0.3605 Sigma2 Prior: -525.1646 Regularization: 0.0006
Iter: 7320 Training Loss: -526.0834
Negative Log Likelihood: 0.3073 Sigma2 Prior: -526.3913 Regularization: 0.0006
Iter: 7330 Training Loss: -527.9067
Negative Log Likelihood: 0.2586 Sigma2 Prior: -528.1660 Regularization: 0.0006
Iter: 7340 Training Loss: -528.1438
Negative Log Likelihood: 0.3374 Sigma2 Prior: -528.4818 Regularization: 0.0006
Iter: 7350 Training Loss: -530.8792
Negative Log Likelihood: 0.3161 Sigma2 Prior: -531.1960 Regularization: 0.0006
Iter: 7360 Training Loss: -531.4239
Negative Log Likelihood: 0.3570 Sigma2 Prior: -531.7815 Regularization: 0.0006
Iter: 7370 Training Loss: -532.6468
Negative Log Likelihood: 0.3356 Sigma2 Prior: -532.9830 Regularization: 0.0006
Iter: 7380 Training Loss: -534.3989
Negative Log Likelihood: 0.3038 Sigma2 Prior: -534.7034 Regularization: 0.0006
Iter: 7390 Training Loss: -534.4984
Negative Log Likelihood: 0.3946 Sigma2 Prior: -534.8936 Regularization: 0.0006
Iter: 7400 Training Loss: -536.3368
Negative Log Likelihood: 0.3655 Sigma2 Prior: -536.7029 Regularization: 0.0006
Iter: 7410 Training Loss: -538.2786
Negative Log Likelihood: 0.3120 Sigma2 Prior: -538.5913 Regularization: 0.0006
Iter: 7420 Training Loss: -538.0662
Negative Log Likelihood: 0.4357 Sigma2 Prior: -538.5025 Regularization: 0.0006
Iter: 7430 Training Loss: -540.6475
Negative Log Likelihood: 0.3584 Sigma2 Prior: -541.0065 Regularization: 0.0006
Iter: 7440 Training Loss: -541.6620
Negative Log Likelihood: 0.3760 Sigma2 Prior: -542.0386 Regularization: 0.0006
Iter: 7450 Training Loss: -542.7599
Negative Log Likelihood: 0.4405 Sigma2 Prior: -543.2010 Regularization: 0.0006
Iter: 7460 Training Loss: -545.5134
Negative Log Likelihood: 0.2924 Sigma2 Prior: -545.8063 Regularization: 0.0006
Iter: 7470 Training Loss: -545.3436
Negative Log Likelihood: 0.4021 Sigma2 Prior: -545.7464 Regularization: 0.0006
Iter: 7480 Training Loss: -546.7637
Negative Log Likelihood: 0.4023 Sigma2 Prior: -547.1666 Regularization: 0.0006
Iter: 7490 Training Loss: -548.1868
Negative Log Likelihood: 0.3736 Sigma2 Prior: -548.5610 Regularization: 0.0006
Iter: 7500 Training Loss: -550.4422
Negative Log Likelihood: 0.3111 Sigma2 Prior: -550.7540 Regularization: 0.0006
Iter: 7510 Training Loss: -551.5457
Negative Log Likelihood: 0.3787 Sigma2 Prior: -551.9250 Regularization: 0.0006
Iter: 7520 Training Loss: -552.6560
Negative Log Likelihood: 0.4300 Sigma2 Prior: -553.0866 Regularization: 0.0006
Iter: 7530 Training Loss: -554.1078
Negative Log Likelihood: 0.3926 Sigma2 Prior: -554.5011 Regularization: 0.0006
Iter: 7540 Training Loss: -556.5126
Negative Log Likelihood: 0.3516 Sigma2 Prior: -556.8649 Regularization: 0.0006
Iter: 7550 Training Loss: -557.4570
Negative Log Likelihood: 0.4168 Sigma2 Prior: -557.8744 Regularization: 0.0006
Iter: 7560 Training Loss: -559.4796
Negative Log Likelihood: 0.4165 Sigma2 Prior: -559.8967 Regularization: 0.0006
Iter: 7570 Training Loss: -560.6470
Negative Log Likelihood: 0.4352 Sigma2 Prior: -561.0828 Regularization: 0.0006
Iter: 7580 Training Loss: -562.1154
Negative Log Likelihood: 0.4256 Sigma2 Prior: -562.5416 Regularization: 0.0006
Iter: 7590 Training Loss: -565.0687
Negative Log Likelihood: 0.3821 Sigma2 Prior: -565.4514 Regularization: 0.0006
Iter: 7600 Training Loss: -565.4888
Negative Log Likelihood: 0.4872 Sigma2 Prior: -565.9766 Regularization: 0.0006
Iter: 7610 Training Loss: -568.5696
Negative Log Likelihood: 0.4334 Sigma2 Prior: -569.0035 Regularization: 0.0006
Iter: 7620 Training Loss: -569.1240
Negative Log Likelihood: 0.4659 Sigma2 Prior: -569.5905 Regularization: 0.0006
Iter: 7630 Training Loss: -571.7667
Negative Log Likelihood: 0.3821 Sigma2 Prior: -572.1494 Regularization: 0.0006
Iter: 7640 Training Loss: -572.4963
Negative Log Likelihood: 0.4969 Sigma2 Prior: -572.9938 Regularization: 0.0006
Iter: 7650 Training Loss: -574.7007
Negative Log Likelihood: 0.3969 Sigma2 Prior: -575.0982 Regularization: 0.0006
Iter: 7660 Training Loss: -577.3154
Negative Log Likelihood: 0.4244 Sigma2 Prior: -577.7404 Regularization: 0.0006
Iter: 7670 Training Loss: -579.1325
Negative Log Likelihood: 0.4824 Sigma2 Prior: -579.6155 Regularization: 0.0006
Iter: 7680 Training Loss: -581.2591
Negative Log Likelihood: 0.4940 Sigma2 Prior: -581.7537 Regularization: 0.0006
Iter: 7690 Training Loss: -581.7928
Negative Log Likelihood: 0.6302 Sigma2 Prior: -582.4236 Regularization: 0.0006
Iter: 7700 Training Loss: -585.8854
Negative Log Likelihood: 0.4877 Sigma2 Prior: -586.3738 Regularization: 0.0006
Iter: 7710 Training Loss: -586.9726
Negative Log Likelihood: 0.4877 Sigma2 Prior: -587.4609 Regularization: 0.0006
Iter: 7720 Training Loss: -588.2590
Negative Log Likelihood: 0.4859 Sigma2 Prior: -588.7455 Regularization: 0.0006
Iter: 7730 Training Loss: -591.1371
Negative Log Likelihood: 0.5521 Sigma2 Prior: -591.6899 Regularization: 0.0006
Iter: 7740 Training Loss: -593.5674
Negative Log Likelihood: 0.5349 Sigma2 Prior: -594.1028 Regularization: 0.0006
Iter: 7750 Training Loss: -595.7170
Negative Log Likelihood: 0.6083 Sigma2 Prior: -596.3259 Regularization: 0.0006
Iter: 7760 Training Loss: -596.4366
Negative Log Likelihood: 0.6948 Sigma2 Prior: -597.1320 Regularization: 0.0006
Iter: 7770 Training Loss: -599.3409
Negative Log Likelihood: 0.6469 Sigma2 Prior: -599.9884 Regularization: 0.0006
Iter: 7780 Training Loss: -603.0228
Negative Log Likelihood: 0.5981 Sigma2 Prior: -603.6215 Regularization: 0.0006
Iter: 7790 Training Loss: -605.3713
Negative Log Likelihood: 0.5498 Sigma2 Prior: -605.9217 Regularization: 0.0006
Iter: 7800 Training Loss: -609.1797
Negative Log Likelihood: 0.4931 Sigma2 Prior: -609.6735 Regularization: 0.0006
Iter: 7810 Training Loss: -610.2342
Negative Log Likelihood: 0.6868 Sigma2 Prior: -610.9216 Regularization: 0.0006
Iter: 7820 Training Loss: -613.6908
Negative Log Likelihood: 0.4796 Sigma2 Prior: -614.1710 Regularization: 0.0006
Iter: 7830 Training Loss: -616.5332
Negative Log Likelihood: 0.6951 Sigma2 Prior: -617.2289 Regularization: 0.0006
Iter: 7840 Training Loss: -618.2950
Negative Log Likelihood: 0.6093 Sigma2 Prior: -618.9048 Regularization: 0.0006
Iter: 7850 Training Loss: -623.7534
Negative Log Likelihood: 0.5986 Sigma2 Prior: -624.3526 Regularization: 0.0006
Iter: 7860 Training Loss: -624.5707
Negative Log Likelihood: 0.7636 Sigma2 Prior: -625.3349 Regularization: 0.0006
Iter: 7870 Training Loss: -628.5310
Negative Log Likelihood: 0.8783 Sigma2 Prior: -629.4099 Regularization: 0.0006
Iter: 7880 Training Loss: -633.5345
Negative Log Likelihood: 0.7158 Sigma2 Prior: -634.2509 Regularization: 0.0006
Iter: 7890 Training Loss: -634.5817
Negative Log Likelihood: 0.7847 Sigma2 Prior: -635.3670 Regularization: 0.0006
Iter: 7900 Training Loss: -638.7365
Negative Log Likelihood: 1.0058 Sigma2 Prior: -639.7429 Regularization: 0.0006
Iter: 7910 Training Loss: -641.9183
Negative Log Likelihood: 0.9153 Sigma2 Prior: -642.8342 Regularization: 0.0006
Iter: 7920 Training Loss: -646.1539
Negative Log Likelihood: 0.8628 Sigma2 Prior: -647.0173 Regularization: 0.0006
Iter: 7930 Training Loss: -649.1367
Negative Log Likelihood: 0.9446 Sigma2 Prior: -650.0819 Regularization: 0.0006
Iter: 7940 Training Loss: -652.8020
Negative Log Likelihood: 0.9714 Sigma2 Prior: -653.7740 Regularization: 0.0006
Iter: 7950 Training Loss: -657.9605
Negative Log Likelihood: 1.0895 Sigma2 Prior: -659.0506 Regularization: 0.0006
Iter: 7960 Training Loss: -660.2950
Negative Log Likelihood: 1.0911 Sigma2 Prior: -661.3867 Regularization: 0.0006
Iter: 7970 Training Loss: -667.2141
Negative Log Likelihood: 1.2111 Sigma2 Prior: -668.4257 Regularization: 0.0006
Iter: 7980 Training Loss: -666.0321
Negative Log Likelihood: 1.1134 Sigma2 Prior: -667.1461 Regularization: 0.0006
Iter: 7990 Training Loss: -678.0259
Negative Log Likelihood: 1.3894 Sigma2 Prior: -679.4160 Regularization: 0.0006
Iter: 8000 Training Loss: -681.8314
Negative Log Likelihood: 1.4271 Sigma2 Prior: -683.2592 Regularization: 0.0006
Iter: 8010 Training Loss: -684.2035
Negative Log Likelihood: 1.6775 Sigma2 Prior: -685.8817 Regularization: 0.0006
Iter: 8020 Training Loss: -689.0410
Negative Log Likelihood: 1.9578 Sigma2 Prior: -690.9993 Regularization: 0.0006
Iter: 8030 Training Loss: -695.8165
Negative Log Likelihood: 1.3108 Sigma2 Prior: -697.1279 Regularization: 0.0006
Iter: 8040 Training Loss: -704.7280
Negative Log Likelihood: 2.0477 Sigma2 Prior: -706.7764 Regularization: 0.0006
Iter: 8050 Training Loss: -704.0172
Negative Log Likelihood: 2.1545 Sigma2 Prior: -706.1723 Regularization: 0.0006
Iter: 8060 Training Loss: -710.6259
Negative Log Likelihood: 2.1693 Sigma2 Prior: -712.7958 Regularization: 0.0006
Iter: 8070 Training Loss: -716.4705
Negative Log Likelihood: 2.2965 Sigma2 Prior: -718.7676 Regularization: 0.0006
Iter: 8080 Training Loss: -724.4063
Negative Log Likelihood: 2.7770 Sigma2 Prior: -727.1839 Regularization: 0.0006
Iter: 8090 Training Loss: -726.2371
Negative Log Likelihood: 3.1028 Sigma2 Prior: -729.3405 Regularization: 0.0006
Iter: 8100 Training Loss: -720.8698
Negative Log Likelihood: 3.7224 Sigma2 Prior: -724.5928 Regularization: 0.0006
Iter: 8110 Training Loss: -723.1234
Negative Log Likelihood: 3.6126 Sigma2 Prior: -726.7365 Regularization: 0.0006
Iter: 8120 Training Loss: -728.4952
Negative Log Likelihood: 3.4008 Sigma2 Prior: -731.8966 Regularization: 0.0006
Iter: 8130 Training Loss: -719.7737
Negative Log Likelihood: 3.4977 Sigma2 Prior: -723.2720 Regularization: 0.0006
Iter: 8140 Training Loss: -733.0047
Negative Log Likelihood: 3.7670 Sigma2 Prior: -736.7723 Regularization: 0.0006
Iter: 8150 Training Loss: -718.3965
Negative Log Likelihood: 3.8856 Sigma2 Prior: -722.2827 Regularization: 0.0006
Iter: 8160 Training Loss: -719.7645
Negative Log Likelihood: 3.8564 Sigma2 Prior: -723.6216 Regularization: 0.0006
Iter: 8170 Training Loss: -719.3173
Negative Log Likelihood: 4.0552 Sigma2 Prior: -723.3732 Regularization: 0.0006
Iter: 8180 Training Loss: -713.3293
Negative Log Likelihood: 3.5592 Sigma2 Prior: -716.8893 Regularization: 0.0006
Iter: 8190 Training Loss: -730.1866
Negative Log Likelihood: 3.1498 Sigma2 Prior: -733.3372 Regularization: 0.0006
Iter: 8200 Training Loss: -727.2981
Negative Log Likelihood: 3.1460 Sigma2 Prior: -730.4448 Regularization: 0.0006
Iter: 8210 Training Loss: -723.1311
Negative Log Likelihood: 3.6581 Sigma2 Prior: -726.7899 Regularization: 0.0006
Iter: 8220 Training Loss: -719.0230
Negative Log Likelihood: 4.3498 Sigma2 Prior: -723.3734 Regularization: 0.0006
Iter: 8230 Training Loss: -722.8386
Negative Log Likelihood: 3.5482 Sigma2 Prior: -726.3875 Regularization: 0.0006
Iter: 8240 Training Loss: -714.7620
Negative Log Likelihood: 4.4841 Sigma2 Prior: -719.2468 Regularization: 0.0006
Iter: 8250 Training Loss: -732.4680
Negative Log Likelihood: 2.6019 Sigma2 Prior: -735.0706 Regularization: 0.0006
Iter: 8260 Training Loss: -729.0623
Negative Log Likelihood: 3.0745 Sigma2 Prior: -732.1375 Regularization: 0.0006
Iter: 8270 Training Loss: -726.3767
Negative Log Likelihood: 3.3853 Sigma2 Prior: -729.7627 Regularization: 0.0006
Iter: 8280 Training Loss: -724.2406
Negative Log Likelihood: 3.1855 Sigma2 Prior: -727.4268 Regularization: 0.0006
Iter: 8290 Training Loss: -714.7601
Negative Log Likelihood: 4.1029 Sigma2 Prior: -718.8637 Regularization: 0.0006
Iter: 8300 Training Loss: -716.2152
Negative Log Likelihood: 3.8232 Sigma2 Prior: -720.0391 Regularization: 0.0006
Iter: 8310 Training Loss: -712.0018
Negative Log Likelihood: 4.2479 Sigma2 Prior: -716.2503 Regularization: 0.0006
Iter: 8320 Training Loss: -719.8954
Negative Log Likelihood: 4.1409 Sigma2 Prior: -724.0370 Regularization: 0.0006
Iter: 8330 Training Loss: -731.3882
Negative Log Likelihood: 3.3558 Sigma2 Prior: -734.7447 Regularization: 0.0006
Iter: 8340 Training Loss: -722.7627
Negative Log Likelihood: 3.3937 Sigma2 Prior: -726.1570 Regularization: 0.0007
Iter: 8350 Training Loss: -718.9942
Negative Log Likelihood: 4.2085 Sigma2 Prior: -723.2034 Regularization: 0.0007
Iter: 8360 Training Loss: -717.6591
Negative Log Likelihood: 3.9180 Sigma2 Prior: -721.5777 Regularization: 0.0007
Iter: 8370 Training Loss: -724.9601
Negative Log Likelihood: 3.6181 Sigma2 Prior: -728.5789 Regularization: 0.0007
Iter: 8380 Training Loss: -716.1368
Negative Log Likelihood: 3.6495 Sigma2 Prior: -719.7870 Regularization: 0.0007
Iter: 8390 Training Loss: -722.9307
Negative Log Likelihood: 3.6155 Sigma2 Prior: -726.5469 Regularization: 0.0007
Iter: 8400 Training Loss: -731.5214
Negative Log Likelihood: 3.1035 Sigma2 Prior: -734.6255 Regularization: 0.0007
Iter: 8410 Training Loss: -729.8466
Negative Log Likelihood: 2.9229 Sigma2 Prior: -732.7702 Regularization: 0.0007
Iter: 8420 Training Loss: -724.0089
Negative Log Likelihood: 3.8031 Sigma2 Prior: -727.8127 Regularization: 0.0007
Iter: 8430 Training Loss: -714.6552
Negative Log Likelihood: 4.4104 Sigma2 Prior: -719.0662 Regularization: 0.0007
Iter: 8440 Training Loss: -721.2751
Negative Log Likelihood: 3.7415 Sigma2 Prior: -725.0172 Regularization: 0.0007
Iter: 8450 Training Loss: -719.5909
Negative Log Likelihood: 3.5343 Sigma2 Prior: -723.1259 Regularization: 0.0007
Iter: 8460 Training Loss: -727.0185
Negative Log Likelihood: 3.4468 Sigma2 Prior: -730.4659 Regularization: 0.0007
Iter: 8470 Training Loss: -728.1290
Negative Log Likelihood: 2.7247 Sigma2 Prior: -730.8544 Regularization: 0.0007
Iter: 8480 Training Loss: -728.9178
Negative Log Likelihood: 3.7045 Sigma2 Prior: -732.6229 Regularization: 0.0007
Iter: 8490 Training Loss: -726.9268
Negative Log Likelihood: 3.9085 Sigma2 Prior: -730.8359 Regularization: 0.0007
Iter: 8500 Training Loss: -725.3859
Negative Log Likelihood: 3.8413 Sigma2 Prior: -729.2279 Regularization: 0.0007
Iter: 8510 Training Loss: -738.1962
Negative Log Likelihood: 3.2718 Sigma2 Prior: -741.4688 Regularization: 0.0007
Iter: 8520 Training Loss: -723.6218
Negative Log Likelihood: 3.7442 Sigma2 Prior: -727.3666 Regularization: 0.0007
Iter: 8530 Training Loss: -726.9059
Negative Log Likelihood: 3.7413 Sigma2 Prior: -730.6479 Regularization: 0.0007
Iter: 8540 Training Loss: -722.7018
Negative Log Likelihood: 2.8001 Sigma2 Prior: -725.5026 Regularization: 0.0007
Iter: 8550 Training Loss: -723.7197
Negative Log Likelihood: 3.7663 Sigma2 Prior: -727.4867 Regularization: 0.0007
Iter: 8560 Training Loss: -729.8442
Negative Log Likelihood: 3.3861 Sigma2 Prior: -733.2311 Regularization: 0.0007
Iter: 8570 Training Loss: -725.7440
Negative Log Likelihood: 4.1059 Sigma2 Prior: -729.8505 Regularization: 0.0007
Iter: 8580 Training Loss: -725.7966
Negative Log Likelihood: 3.6054 Sigma2 Prior: -729.4026 Regularization: 0.0007
Iter: 8590 Training Loss: -733.5155
Negative Log Likelihood: 2.8459 Sigma2 Prior: -736.3621 Regularization: 0.0007
Iter: 8600 Training Loss: -723.3002
Negative Log Likelihood: 4.1818 Sigma2 Prior: -727.4827 Regularization: 0.0007
Iter: 8610 Training Loss: -723.7338
Negative Log Likelihood: 3.5514 Sigma2 Prior: -727.2859 Regularization: 0.0007
Iter: 8620 Training Loss: -720.8397
Negative Log Likelihood: 3.2784 Sigma2 Prior: -724.1187 Regularization: 0.0007
Iter: 8630 Training Loss: -723.1185
Negative Log Likelihood: 3.4288 Sigma2 Prior: -726.5480 Regularization: 0.0007
Iter: 8640 Training Loss: -732.9208
Negative Log Likelihood: 3.1147 Sigma2 Prior: -736.0363 Regularization: 0.0007
Iter: 8650 Training Loss: -725.8146
Negative Log Likelihood: 3.5567 Sigma2 Prior: -729.3719 Regularization: 0.0007
Iter: 8660 Training Loss: -729.2296
Negative Log Likelihood: 3.8910 Sigma2 Prior: -733.1213 Regularization: 0.0007
Iter: 8670 Training Loss: -733.8488
Negative Log Likelihood: 3.6756 Sigma2 Prior: -737.5250 Regularization: 0.0007
Iter: 8680 Training Loss: -733.3966
Negative Log Likelihood: 2.8372 Sigma2 Prior: -736.2345 Regularization: 0.0007
Iter: 8690 Training Loss: -728.1109
Negative Log Likelihood: 4.2785 Sigma2 Prior: -732.3901 Regularization: 0.0007
Iter: 8700 Training Loss: -728.8820
Negative Log Likelihood: 3.6434 Sigma2 Prior: -732.5261 Regularization: 0.0007
Iter: 8710 Training Loss: -729.1183
Negative Log Likelihood: 3.0863 Sigma2 Prior: -732.2053 Regularization: 0.0007
Iter: 8720 Training Loss: -712.6801
Negative Log Likelihood: 4.5119 Sigma2 Prior: -717.1926 Regularization: 0.0007
Iter: 8730 Training Loss: -727.0815
Negative Log Likelihood: 3.3988 Sigma2 Prior: -730.4811 Regularization: 0.0007
Iter: 8740 Training Loss: -719.4509
Negative Log Likelihood: 3.9226 Sigma2 Prior: -723.3742 Regularization: 0.0007
Iter: 8750 Training Loss: -720.8710
Negative Log Likelihood: 4.0671 Sigma2 Prior: -724.9387 Regularization: 0.0007
Iter: 8760 Training Loss: -720.2423
Negative Log Likelihood: 3.3821 Sigma2 Prior: -723.6251 Regularization: 0.0007
Iter: 8770 Training Loss: -726.3210
Negative Log Likelihood: 3.1061 Sigma2 Prior: -729.4278 Regularization: 0.0007
Iter: 8780 Training Loss: -723.3291
Negative Log Likelihood: 3.7880 Sigma2 Prior: -727.1178 Regularization: 0.0007
Iter: 8790 Training Loss: -719.5295
Negative Log Likelihood: 3.8396 Sigma2 Prior: -723.3698 Regularization: 0.0007
Iter: 8800 Training Loss: -721.6561
Negative Log Likelihood: 4.0904 Sigma2 Prior: -725.7472 Regularization: 0.0007
Iter: 8810 Training Loss: -726.1734
Negative Log Likelihood: 3.9571 Sigma2 Prior: -730.1312 Regularization: 0.0007
Iter: 8820 Training Loss: -721.7119
Negative Log Likelihood: 3.3807 Sigma2 Prior: -725.0933 Regularization: 0.0007
Iter: 8830 Training Loss: -725.8817
Negative Log Likelihood: 4.1460 Sigma2 Prior: -730.0284 Regularization: 0.0007
Iter: 8840 Training Loss: -716.5242
Negative Log Likelihood: 4.1438 Sigma2 Prior: -720.6686 Regularization: 0.0007
Iter: 8850 Training Loss: -723.9176
Negative Log Likelihood: 3.9466 Sigma2 Prior: -727.8649 Regularization: 0.0007
Iter: 8860 Training Loss: -726.4005
Negative Log Likelihood: 2.8471 Sigma2 Prior: -729.2483 Regularization: 0.0007
Iter: 8870 Training Loss: -730.9080
Negative Log Likelihood: 3.7470 Sigma2 Prior: -734.6556 Regularization: 0.0007
Iter: 8880 Training Loss: -726.5134
Negative Log Likelihood: 3.3230 Sigma2 Prior: -729.8370 Regularization: 0.0007
Iter: 8890 Training Loss: -726.4610
Negative Log Likelihood: 3.2903 Sigma2 Prior: -729.7520 Regularization: 0.0007
Iter: 8900 Training Loss: -724.5261
Negative Log Likelihood: 3.6900 Sigma2 Prior: -728.2167 Regularization: 0.0007
Iter: 8910 Training Loss: -728.2583
Negative Log Likelihood: 4.0155 Sigma2 Prior: -732.2745 Regularization: 0.0007
Iter: 8920 Training Loss: -726.8621
Negative Log Likelihood: 3.4761 Sigma2 Prior: -730.3389 Regularization: 0.0007
Iter: 8930 Training Loss: -714.6229
Negative Log Likelihood: 3.9182 Sigma2 Prior: -718.5417 Regularization: 0.0007
Iter: 8940 Training Loss: -725.6716
Negative Log Likelihood: 3.0508 Sigma2 Prior: -728.7230 Regularization: 0.0007
Iter: 8950 Training Loss: -726.3629
Negative Log Likelihood: 3.6837 Sigma2 Prior: -730.0472 Regularization: 0.0007
Iter: 8960 Training Loss: -730.8778
Negative Log Likelihood: 3.3618 Sigma2 Prior: -734.2403 Regularization: 0.0007
Iter: 8970 Training Loss: -717.5760
Negative Log Likelihood: 3.6383 Sigma2 Prior: -721.2150 Regularization: 0.0007
Iter: 8980 Training Loss: -721.4421
Negative Log Likelihood: 3.3323 Sigma2 Prior: -724.7750 Regularization: 0.0007
Iter: 8990 Training Loss: -729.6745
Negative Log Likelihood: 3.9095 Sigma2 Prior: -733.5847 Regularization: 0.0007
Iter: 9000 Training Loss: -736.0467
Negative Log Likelihood: 2.6645 Sigma2 Prior: -738.7118 Regularization: 0.0007
Iter: 9010 Training Loss: -735.6461
Negative Log Likelihood: 3.4537 Sigma2 Prior: -739.1005 Regularization: 0.0007
Iter: 9020 Training Loss: -723.2973
Negative Log Likelihood: 2.8589 Sigma2 Prior: -726.1569 Regularization: 0.0007
Iter: 9030 Training Loss: -724.4771
Negative Log Likelihood: 3.4209 Sigma2 Prior: -727.8987 Regularization: 0.0007
Iter: 9040 Training Loss: -733.3508
Negative Log Likelihood: 3.2060 Sigma2 Prior: -736.5576 Regularization: 0.0007
Iter: 9050 Training Loss: -717.8715
Negative Log Likelihood: 4.2240 Sigma2 Prior: -722.0963 Regularization: 0.0007
Iter: 9060 Training Loss: -726.1614
Negative Log Likelihood: 3.6407 Sigma2 Prior: -729.8028 Regularization: 0.0007
Iter: 9070 Training Loss: -717.1057
Negative Log Likelihood: 3.9735 Sigma2 Prior: -721.0798 Regularization: 0.0007
Iter: 9080 Training Loss: -724.9637
Negative Log Likelihood: 3.0055 Sigma2 Prior: -727.9698 Regularization: 0.0007
Iter: 9090 Training Loss: -724.5847
Negative Log Likelihood: 3.3897 Sigma2 Prior: -727.9750 Regularization: 0.0007
Iter: 9100 Training Loss: -720.2960
Negative Log Likelihood: 4.3165 Sigma2 Prior: -724.6132 Regularization: 0.0007
Iter: 9110 Training Loss: -715.5127
Negative Log Likelihood: 3.8981 Sigma2 Prior: -719.4115 Regularization: 0.0007
Iter: 9120 Training Loss: -732.9843
Negative Log Likelihood: 2.2531 Sigma2 Prior: -735.2380 Regularization: 0.0007
Iter: 9130 Training Loss: -738.4504
Negative Log Likelihood: 3.3080 Sigma2 Prior: -741.7591 Regularization: 0.0007
Iter: 9140 Training Loss: -723.3779
Negative Log Likelihood: 3.0295 Sigma2 Prior: -726.4081 Regularization: 0.0007
Iter: 9150 Training Loss: -723.4379
Negative Log Likelihood: 3.2820 Sigma2 Prior: -726.7205 Regularization: 0.0007
Iter: 9160 Training Loss: -725.7991
Negative Log Likelihood: 3.2597 Sigma2 Prior: -729.0594 Regularization: 0.0007
Iter: 9170 Training Loss: -719.1531
Negative Log Likelihood: 3.7250 Sigma2 Prior: -722.8788 Regularization: 0.0007
Iter: 9180 Training Loss: -716.4726
Negative Log Likelihood: 4.1819 Sigma2 Prior: -720.6552 Regularization: 0.0007
Iter: 9190 Training Loss: -721.1940
Negative Log Likelihood: 3.7420 Sigma2 Prior: -724.9366 Regularization: 0.0007
Iter: 9200 Training Loss: -722.8075
Negative Log Likelihood: 4.0576 Sigma2 Prior: -726.8657 Regularization: 0.0007
Iter: 9210 Training Loss: -717.7043
Negative Log Likelihood: 4.0551 Sigma2 Prior: -721.7600 Regularization: 0.0007
Iter: 9220 Training Loss: -725.6787
Negative Log Likelihood: 3.7425 Sigma2 Prior: -729.4219 Regularization: 0.0007
Iter: 9230 Training Loss: -722.1591
Negative Log Likelihood: 4.3870 Sigma2 Prior: -726.5468 Regularization: 0.0007
Iter: 9240 Training Loss: -734.3487
Negative Log Likelihood: 2.8692 Sigma2 Prior: -737.2186 Regularization: 0.0007
Iter: 9250 Training Loss: -729.3573
Negative Log Likelihood: 3.2031 Sigma2 Prior: -732.5611 Regularization: 0.0007
Iter: 9260 Training Loss: -731.9407
Negative Log Likelihood: 3.6293 Sigma2 Prior: -735.5707 Regularization: 0.0007
Iter: 9270 Training Loss: -719.8581
Negative Log Likelihood: 3.8917 Sigma2 Prior: -723.7504 Regularization: 0.0007
Iter: 9280 Training Loss: -728.0004
Negative Log Likelihood: 3.9761 Sigma2 Prior: -731.9772 Regularization: 0.0007
Iter: 9290 Training Loss: -726.0537
Negative Log Likelihood: 3.5670 Sigma2 Prior: -729.6214 Regularization: 0.0007
Iter: 9300 Training Loss: -716.9839
Negative Log Likelihood: 4.0456 Sigma2 Prior: -721.0303 Regularization: 0.0007
Iter: 9310 Training Loss: -721.4866
Negative Log Likelihood: 3.2566 Sigma2 Prior: -724.7439 Regularization: 0.0007
Iter: 9320 Training Loss: -727.2327
Negative Log Likelihood: 3.7584 Sigma2 Prior: -730.9918 Regularization: 0.0007
Iter: 9330 Training Loss: -723.7473
Negative Log Likelihood: 3.4565 Sigma2 Prior: -727.2045 Regularization: 0.0007
Iter: 9340 Training Loss: -720.3311
Negative Log Likelihood: 2.8337 Sigma2 Prior: -723.1655 Regularization: 0.0007
Iter: 9350 Training Loss: -732.8484
Negative Log Likelihood: 3.6375 Sigma2 Prior: -736.4866 Regularization: 0.0007
Iter: 9360 Training Loss: -727.8186
Negative Log Likelihood: 3.9055 Sigma2 Prior: -731.7247 Regularization: 0.0007
Iter: 9370 Training Loss: -720.3519
Negative Log Likelihood: 3.1798 Sigma2 Prior: -723.5323 Regularization: 0.0007
Iter: 9380 Training Loss: -720.5510
Negative Log Likelihood: 3.8985 Sigma2 Prior: -724.4502 Regularization: 0.0007
Iter: 9390 Training Loss: -733.0930
Negative Log Likelihood: 3.1862 Sigma2 Prior: -736.2798 Regularization: 0.0007
Iter: 9400 Training Loss: -725.5613
Negative Log Likelihood: 3.1680 Sigma2 Prior: -728.7300 Regularization: 0.0007
Iter: 9410 Training Loss: -725.6384
Negative Log Likelihood: 3.3273 Sigma2 Prior: -728.9664 Regularization: 0.0007
Iter: 9420 Training Loss: -728.6053
Negative Log Likelihood: 3.5656 Sigma2 Prior: -732.1715 Regularization: 0.0007
Iter: 9430 Training Loss: -723.3250
Negative Log Likelihood: 2.9065 Sigma2 Prior: -726.2322 Regularization: 0.0007
Iter: 9440 Training Loss: -725.1874
Negative Log Likelihood: 3.1423 Sigma2 Prior: -728.3303 Regularization: 0.0007
Iter: 9450 Training Loss: -723.4876
Negative Log Likelihood: 3.6745 Sigma2 Prior: -727.1628 Regularization: 0.0007
Iter: 9460 Training Loss: -722.7814
Negative Log Likelihood: 3.5038 Sigma2 Prior: -726.2859 Regularization: 0.0007
Iter: 9470 Training Loss: -722.2147
Negative Log Likelihood: 3.9076 Sigma2 Prior: -726.1230 Regularization: 0.0007
Iter: 9480 Training Loss: -725.7460
Negative Log Likelihood: 3.7513 Sigma2 Prior: -729.4980 Regularization: 0.0007
Iter: 9490 Training Loss: -725.7047
Negative Log Likelihood: 3.6020 Sigma2 Prior: -729.3074 Regularization: 0.0007
Iter: 9500 Training Loss: -723.1334
Negative Log Likelihood: 4.1575 Sigma2 Prior: -727.2916 Regularization: 0.0007
Iter: 9510 Training Loss: -732.8911
Negative Log Likelihood: 3.6147 Sigma2 Prior: -736.5064 Regularization: 0.0007
Iter: 9520 Training Loss: -722.1536
Negative Log Likelihood: 4.1723 Sigma2 Prior: -726.3266 Regularization: 0.0007
Iter: 9530 Training Loss: -724.6255
Negative Log Likelihood: 4.2225 Sigma2 Prior: -728.8486 Regularization: 0.0007
Iter: 9540 Training Loss: -736.0342
Negative Log Likelihood: 2.5284 Sigma2 Prior: -738.5632 Regularization: 0.0007
Iter: 9550 Training Loss: -718.6934
Negative Log Likelihood: 4.0685 Sigma2 Prior: -722.7625 Regularization: 0.0007
Iter: 9560 Training Loss: -724.2827
Negative Log Likelihood: 3.8500 Sigma2 Prior: -728.1334 Regularization: 0.0007
Iter: 9570 Training Loss: -716.8318
Negative Log Likelihood: 3.9757 Sigma2 Prior: -720.8082 Regularization: 0.0007
Iter: 9580 Training Loss: -732.3461
Negative Log Likelihood: 2.9635 Sigma2 Prior: -735.3104 Regularization: 0.0007
Iter: 9590 Training Loss: -725.3162
Negative Log Likelihood: 3.8630 Sigma2 Prior: -729.1799 Regularization: 0.0007
Iter: 9600 Training Loss: -725.2659
Negative Log Likelihood: 3.0420 Sigma2 Prior: -728.3085 Regularization: 0.0007
Iter: 9610 Training Loss: -725.2468
Negative Log Likelihood: 3.3954 Sigma2 Prior: -728.6428 Regularization: 0.0007
Iter: 9620 Training Loss: -724.9108
Negative Log Likelihood: 3.4949 Sigma2 Prior: -728.4064 Regularization: 0.0007
Iter: 9630 Training Loss: -734.1586
Negative Log Likelihood: 3.4497 Sigma2 Prior: -737.6090 Regularization: 0.0007
Iter: 9640 Training Loss: -729.0236
Negative Log Likelihood: 3.7260 Sigma2 Prior: -732.7502 Regularization: 0.0007
Iter: 9650 Training Loss: -720.0087
Negative Log Likelihood: 3.7571 Sigma2 Prior: -723.7665 Regularization: 0.0007
Iter: 9660 Training Loss: -726.0304
Negative Log Likelihood: 3.1960 Sigma2 Prior: -729.2271 Regularization: 0.0007
Iter: 9670 Training Loss: -722.8585
Negative Log Likelihood: 3.8536 Sigma2 Prior: -726.7128 Regularization: 0.0007
Iter: 9680 Training Loss: -720.5919
Negative Log Likelihood: 2.7820 Sigma2 Prior: -723.3746 Regularization: 0.0007
Iter: 9690 Training Loss: -724.2744
Negative Log Likelihood: 3.5505 Sigma2 Prior: -727.8256 Regularization: 0.0007
Iter: 9700 Training Loss: -723.8167
Negative Log Likelihood: 4.0863 Sigma2 Prior: -727.9037 Regularization: 0.0007
Iter: 9710 Training Loss: -716.5340
Negative Log Likelihood: 3.4036 Sigma2 Prior: -719.9383 Regularization: 0.0007
Iter: 9720 Training Loss: -728.5602
Negative Log Likelihood: 3.4706 Sigma2 Prior: -732.0315 Regularization: 0.0007
Iter: 9730 Training Loss: -721.7927
Negative Log Likelihood: 3.0529 Sigma2 Prior: -724.8463 Regularization: 0.0007
Iter: 9740 Training Loss: -726.1206
Negative Log Likelihood: 3.4827 Sigma2 Prior: -729.6040 Regularization: 0.0007
Iter: 9750 Training Loss: -721.5478
Negative Log Likelihood: 3.8809 Sigma2 Prior: -725.4293 Regularization: 0.0007
Iter: 9760 Training Loss: -719.5406
Negative Log Likelihood: 3.4701 Sigma2 Prior: -723.0114 Regularization: 0.0007
Iter: 9770 Training Loss: -729.9786
Negative Log Likelihood: 3.8140 Sigma2 Prior: -733.7933 Regularization: 0.0007
Iter: 9780 Training Loss: -730.5014
Negative Log Likelihood: 3.7914 Sigma2 Prior: -734.2935 Regularization: 0.0007
Iter: 9790 Training Loss: -724.1288
Negative Log Likelihood: 3.7275 Sigma2 Prior: -727.8569 Regularization: 0.0007
Iter: 9800 Training Loss: -721.3828
Negative Log Likelihood: 3.3073 Sigma2 Prior: -724.6908 Regularization: 0.0007
Iter: 9810 Training Loss: -729.8482
Negative Log Likelihood: 3.6370 Sigma2 Prior: -733.4858 Regularization: 0.0007
Iter: 9820 Training Loss: -731.4291
Negative Log Likelihood: 3.2815 Sigma2 Prior: -734.7113 Regularization: 0.0007
Iter: 9830 Training Loss: -720.8963
Negative Log Likelihood: 3.6322 Sigma2 Prior: -724.5292 Regularization: 0.0007
Iter: 9840 Training Loss: -720.8702
Negative Log Likelihood: 3.4084 Sigma2 Prior: -724.2794 Regularization: 0.0007
Iter: 9850 Training Loss: -715.0273
Negative Log Likelihood: 4.3720 Sigma2 Prior: -719.4000 Regularization: 0.0007
Iter: 9860 Training Loss: -726.3394
Negative Log Likelihood: 3.1724 Sigma2 Prior: -729.5125 Regularization: 0.0007
Iter: 9870 Training Loss: -727.1762
Negative Log Likelihood: 3.0080 Sigma2 Prior: -730.1849 Regularization: 0.0007
Iter: 9880 Training Loss: -724.4603
Negative Log Likelihood: 3.7881 Sigma2 Prior: -728.2491 Regularization: 0.0007
Iter: 9890 Training Loss: -720.6857
Negative Log Likelihood: 3.4933 Sigma2 Prior: -724.1797 Regularization: 0.0007
Iter: 9900 Training Loss: -725.6106
Negative Log Likelihood: 4.2384 Sigma2 Prior: -729.8497 Regularization: 0.0007
Iter: 9910 Training Loss: -728.0574
Negative Log Likelihood: 3.5420 Sigma2 Prior: -731.6001 Regularization: 0.0007
Iter: 9920 Training Loss: -725.8070
Negative Log Likelihood: 3.3495 Sigma2 Prior: -729.1572 Regularization: 0.0007
Iter: 9930 Training Loss: -716.8621
Negative Log Likelihood: 3.4562 Sigma2 Prior: -720.3189 Regularization: 0.0007
Iter: 9940 Training Loss: -724.5054
Negative Log Likelihood: 3.5533 Sigma2 Prior: -728.0594 Regularization: 0.0007
Iter: 9950 Training Loss: -729.5893
Negative Log Likelihood: 3.5393 Sigma2 Prior: -733.1292 Regularization: 0.0007
Iter: 9960 Training Loss: -727.4670
Negative Log Likelihood: 3.7205 Sigma2 Prior: -731.1882 Regularization: 0.0007
Iter: 9970 Training Loss: -725.0367
Negative Log Likelihood: 3.6519 Sigma2 Prior: -728.6893 Regularization: 0.0007
Iter: 9980 Training Loss: -726.0872
Negative Log Likelihood: 3.8917 Sigma2 Prior: -729.9796 Regularization: 0.0007
Iter: 9990 Training Loss: -740.3272
Negative Log Likelihood: 2.6441 Sigma2 Prior: -742.9720 Regularization: 0.0007
Iter: 10000 Training Loss: -727.2979
Negative Log Likelihood: 3.2782 Sigma2 Prior: -730.5768 Regularization: 0.0007
Iter: 10010 Training Loss: -718.1138
Negative Log Likelihood: 3.8670 Sigma2 Prior: -721.9814 Regularization: 0.0007
Iter: 10020 Training Loss: -724.4476
Negative Log Likelihood: 3.5304 Sigma2 Prior: -727.9786 Regularization: 0.0007
Iter: 10030 Training Loss: -724.0555
Negative Log Likelihood: 3.6668 Sigma2 Prior: -727.7230 Regularization: 0.0007
Iter: 10040 Training Loss: -719.2390
Negative Log Likelihood: 3.5541 Sigma2 Prior: -722.7938 Regularization: 0.0007
Iter: 10050 Training Loss: -724.9733
Negative Log Likelihood: 3.8683 Sigma2 Prior: -728.8423 Regularization: 0.0007
Iter: 10060 Training Loss: -726.0820
Negative Log Likelihood: 3.9256 Sigma2 Prior: -730.0083 Regularization: 0.0007
Iter: 10070 Training Loss: -719.5292
Negative Log Likelihood: 3.1260 Sigma2 Prior: -722.6559 Regularization: 0.0007
Iter: 10080 Training Loss: -709.7963
Negative Log Likelihood: 4.3016 Sigma2 Prior: -714.0986 Regularization: 0.0007
Iter: 10090 Training Loss: -713.2458
Negative Log Likelihood: 4.1739 Sigma2 Prior: -717.4205 Regularization: 0.0007
Iter: 10100 Training Loss: -728.4518
Negative Log Likelihood: 3.5805 Sigma2 Prior: -732.0330 Regularization: 0.0007
Iter: 10110 Training Loss: -712.9708
Negative Log Likelihood: 4.0427 Sigma2 Prior: -717.0142 Regularization: 0.0007
Iter: 10120 Training Loss: -721.4266
Negative Log Likelihood: 4.0046 Sigma2 Prior: -725.4319 Regularization: 0.0007
Iter: 10130 Training Loss: -720.1992
Negative Log Likelihood: 3.6577 Sigma2 Prior: -723.8576 Regularization: 0.0007
Iter: 10140 Training Loss: -726.4006
Negative Log Likelihood: 3.4631 Sigma2 Prior: -729.8643 Regularization: 0.0007
Iter: 10150 Training Loss: -720.7289
Negative Log Likelihood: 3.3852 Sigma2 Prior: -724.1149 Regularization: 0.0007
Iter: 10160 Training Loss: -728.5375
Negative Log Likelihood: 3.7071 Sigma2 Prior: -732.2453 Regularization: 0.0007
Iter: 10170 Training Loss: -718.9753
Negative Log Likelihood: 3.2946 Sigma2 Prior: -722.2706 Regularization: 0.0007
Iter: 10180 Training Loss: -721.2607
Negative Log Likelihood: 4.0826 Sigma2 Prior: -725.3441 Regularization: 0.0007
Iter: 10190 Training Loss: -718.8504
Negative Log Likelihood: 3.9322 Sigma2 Prior: -722.7834 Regularization: 0.0007
Iter: 10200 Training Loss: -714.0603
Negative Log Likelihood: 4.1179 Sigma2 Prior: -718.1790 Regularization: 0.0007
Iter: 10210 Training Loss: -725.5002
Negative Log Likelihood: 3.8197 Sigma2 Prior: -729.3206 Regularization: 0.0007
Iter: 10220 Training Loss: -724.6713
Negative Log Likelihood: 3.7889 Sigma2 Prior: -728.4609 Regularization: 0.0007
Iter: 10230 Training Loss: -718.6456
Negative Log Likelihood: 3.5925 Sigma2 Prior: -722.2389 Regularization: 0.0007
Iter: 10240 Training Loss: -721.5312
Negative Log Likelihood: 4.2753 Sigma2 Prior: -725.8073 Regularization: 0.0007
Iter: 10250 Training Loss: -728.7753
Negative Log Likelihood: 3.0053 Sigma2 Prior: -731.7813 Regularization: 0.0007
Iter: 10260 Training Loss: -722.3220
Negative Log Likelihood: 3.4146 Sigma2 Prior: -725.7374 Regularization: 0.0007
Iter: 10270 Training Loss: -718.8031
Negative Log Likelihood: 3.5740 Sigma2 Prior: -722.3778 Regularization: 0.0007
Iter: 10280 Training Loss: -725.2946
Negative Log Likelihood: 3.5248 Sigma2 Prior: -728.8201 Regularization: 0.0007
Iter: 10290 Training Loss: -725.4212
Negative Log Likelihood: 3.1816 Sigma2 Prior: -728.6035 Regularization: 0.0007
Iter: 10300 Training Loss: -738.0357
Negative Log Likelihood: 2.5912 Sigma2 Prior: -740.6277 Regularization: 0.0007
Iter: 10310 Training Loss: -723.5413
Negative Log Likelihood: 3.8308 Sigma2 Prior: -727.3727 Regularization: 0.0007
Iter: 10320 Training Loss: -731.8154
Negative Log Likelihood: 3.6517 Sigma2 Prior: -735.4678 Regularization: 0.0007
Iter: 10330 Training Loss: -727.9903
Negative Log Likelihood: 3.7024 Sigma2 Prior: -731.6934 Regularization: 0.0007
Iter: 10340 Training Loss: -727.0129
Negative Log Likelihood: 3.6200 Sigma2 Prior: -730.6337 Regularization: 0.0007
Iter: 10350 Training Loss: -725.9701
Negative Log Likelihood: 3.7713 Sigma2 Prior: -729.7421 Regularization: 0.0007
Iter: 10360 Training Loss: -723.6090
Negative Log Likelihood: 3.0301 Sigma2 Prior: -726.6398 Regularization: 0.0007
Iter: 10370 Training Loss: -719.5997
Negative Log Likelihood: 4.5950 Sigma2 Prior: -724.1954 Regularization: 0.0007
Iter: 10380 Training Loss: -725.5078
Negative Log Likelihood: 3.3775 Sigma2 Prior: -728.8860 Regularization: 0.0007
Iter: 10390 Training Loss: -733.6243
Negative Log Likelihood: 3.0463 Sigma2 Prior: -736.6714 Regularization: 0.0007
Iter: 10400 Training Loss: -725.4222
Negative Log Likelihood: 3.8167 Sigma2 Prior: -729.2396 Regularization: 0.0007
Iter: 10410 Training Loss: -730.0896
Negative Log Likelihood: 2.7805 Sigma2 Prior: -732.8708 Regularization: 0.0007
Iter: 10420 Training Loss: -724.8170
Negative Log Likelihood: 4.3834 Sigma2 Prior: -729.2010 Regularization: 0.0007
Iter: 10430 Training Loss: -733.3442
Negative Log Likelihood: 3.1879 Sigma2 Prior: -736.5328 Regularization: 0.0007
Iter: 10440 Training Loss: -721.4923
Negative Log Likelihood: 3.8559 Sigma2 Prior: -725.3490 Regularization: 0.0007
Iter: 10450 Training Loss: -723.7133
Negative Log Likelihood: 3.5308 Sigma2 Prior: -727.2449 Regularization: 0.0007
Iter: 10460 Training Loss: -727.4074
Negative Log Likelihood: 3.6655 Sigma2 Prior: -731.0737 Regularization: 0.0007
Iter: 10470 Training Loss: -726.9879
Negative Log Likelihood: 3.1633 Sigma2 Prior: -730.1520 Regularization: 0.0007
Iter: 10480 Training Loss: -724.1072
Negative Log Likelihood: 3.1066 Sigma2 Prior: -727.2146 Regularization: 0.0007
Iter: 10490 Training Loss: -726.4529
Negative Log Likelihood: 3.3471 Sigma2 Prior: -729.8007 Regularization: 0.0007
Iter: 10500 Training Loss: -719.0605
Negative Log Likelihood: 3.7335 Sigma2 Prior: -722.7948 Regularization: 0.0007
Iter: 10510 Training Loss: -727.8539
Negative Log Likelihood: 3.4388 Sigma2 Prior: -731.2935 Regularization: 0.0007
Iter: 10520 Training Loss: -725.9730
Negative Log Likelihood: 3.1774 Sigma2 Prior: -729.1511 Regularization: 0.0007
Iter: 10530 Training Loss: -737.8246
Negative Log Likelihood: 2.8897 Sigma2 Prior: -740.7151 Regularization: 0.0007
Iter: 10540 Training Loss: -725.4466
Negative Log Likelihood: 4.1656 Sigma2 Prior: -729.6129 Regularization: 0.0007
Iter: 10550 Training Loss: -734.5900
Negative Log Likelihood: 2.8972 Sigma2 Prior: -737.4880 Regularization: 0.0007
Iter: 10560 Training Loss: -730.5712
Negative Log Likelihood: 3.6825 Sigma2 Prior: -734.2545 Regularization: 0.0007
Iter: 10570 Training Loss: -721.5588
Negative Log Likelihood: 4.1166 Sigma2 Prior: -725.6762 Regularization: 0.0007
Iter: 10580 Training Loss: -725.9005
Negative Log Likelihood: 3.5532 Sigma2 Prior: -729.4545 Regularization: 0.0007
Iter: 10590 Training Loss: -726.6443
Negative Log Likelihood: 3.3574 Sigma2 Prior: -730.0025 Regularization: 0.0007
Iter: 10600 Training Loss: -728.4094
Negative Log Likelihood: 3.3053 Sigma2 Prior: -731.7154 Regularization: 0.0007
Iter: 10610 Training Loss: -728.6306
Negative Log Likelihood: 3.3059 Sigma2 Prior: -731.9372 Regularization: 0.0007
Iter: 10620 Training Loss: -724.7853
Negative Log Likelihood: 3.4398 Sigma2 Prior: -728.2259 Regularization: 0.0007
Iter: 10630 Training Loss: -723.4591
Negative Log Likelihood: 4.1308 Sigma2 Prior: -727.5906 Regularization: 0.0007
Iter: 10640 Training Loss: -718.8426
Negative Log Likelihood: 3.8179 Sigma2 Prior: -722.6612 Regularization: 0.0007
Iter: 10650 Training Loss: -721.6413
Negative Log Likelihood: 4.0229 Sigma2 Prior: -725.6649 Regularization: 0.0007
Iter: 10660 Training Loss: -723.3512
Negative Log Likelihood: 3.8513 Sigma2 Prior: -727.2032 Regularization: 0.0007
Iter: 10670 Training Loss: -736.6938
Negative Log Likelihood: 2.7488 Sigma2 Prior: -739.4433 Regularization: 0.0007
Iter: 10680 Training Loss: -725.1272
Negative Log Likelihood: 3.5287 Sigma2 Prior: -728.6567 Regularization: 0.0007
Iter: 10690 Training Loss: -719.7219
Negative Log Likelihood: 3.6536 Sigma2 Prior: -723.3762 Regularization: 0.0007
Iter: 10700 Training Loss: -720.1913
Negative Log Likelihood: 3.9057 Sigma2 Prior: -724.0978 Regularization: 0.0007
Iter: 10710 Training Loss: -732.2683
Negative Log Likelihood: 3.1465 Sigma2 Prior: -735.4156 Regularization: 0.0007
Iter: 10720 Training Loss: -724.3561
Negative Log Likelihood: 3.9756 Sigma2 Prior: -728.3325 Regularization: 0.0007
Iter: 10730 Training Loss: -724.5911
Negative Log Likelihood: 3.9336 Sigma2 Prior: -728.5254 Regularization: 0.0007
Iter: 10740 Training Loss: -732.6777
Negative Log Likelihood: 3.6826 Sigma2 Prior: -736.3610 Regularization: 0.0007
Iter: 10750 Training Loss: -719.4905
Negative Log Likelihood: 3.7164 Sigma2 Prior: -723.2077 Regularization: 0.0007
Iter: 10760 Training Loss: -732.9082
Negative Log Likelihood: 3.8016 Sigma2 Prior: -736.7106 Regularization: 0.0007
Iter: 10770 Training Loss: -715.8447
Negative Log Likelihood: 4.4069 Sigma2 Prior: -720.2524 Regularization: 0.0007
Iter: 10780 Training Loss: -726.1692
Negative Log Likelihood: 3.2154 Sigma2 Prior: -729.3853 Regularization: 0.0007
Iter: 10790 Training Loss: -725.0111
Negative Log Likelihood: 3.3893 Sigma2 Prior: -728.4011 Regularization: 0.0007
Iter: 10800 Training Loss: -721.4257
Negative Log Likelihood: 3.9939 Sigma2 Prior: -725.4203 Regularization: 0.0007
Iter: 10810 Training Loss: -728.8203
Negative Log Likelihood: 3.5668 Sigma2 Prior: -732.3879 Regularization: 0.0007
Iter: 10820 Training Loss: -730.3879
Negative Log Likelihood: 4.0469 Sigma2 Prior: -734.4355 Regularization: 0.0007
Iter: 10830 Training Loss: -729.4758
Negative Log Likelihood: 3.6778 Sigma2 Prior: -733.1543 Regularization: 0.0007
Iter: 10840 Training Loss: -734.1310
Negative Log Likelihood: 2.8688 Sigma2 Prior: -737.0005 Regularization: 0.0007
Iter: 10850 Training Loss: -723.8502
Negative Log Likelihood: 4.3214 Sigma2 Prior: -728.1724 Regularization: 0.0007
Iter: 10860 Training Loss: -725.5198
Negative Log Likelihood: 3.7556 Sigma2 Prior: -729.2761 Regularization: 0.0007
Iter: 10870 Training Loss: -717.2499
Negative Log Likelihood: 4.1653 Sigma2 Prior: -721.4159 Regularization: 0.0007
Iter: 10880 Training Loss: -723.7548
Negative Log Likelihood: 3.1468 Sigma2 Prior: -726.9022 Regularization: 0.0007
Iter: 10890 Training Loss: -735.4520
Negative Log Likelihood: 3.0168 Sigma2 Prior: -738.4695 Regularization: 0.0007
Iter: 10900 Training Loss: -723.0867
Negative Log Likelihood: 3.4428 Sigma2 Prior: -726.5302 Regularization: 0.0007
Iter: 10910 Training Loss: -723.1951
Negative Log Likelihood: 3.9329 Sigma2 Prior: -727.1287 Regularization: 0.0007
Iter: 10920 Training Loss: -725.9384
Negative Log Likelihood: 3.4523 Sigma2 Prior: -729.3914 Regularization: 0.0007
Iter: 10930 Training Loss: -715.9345
Negative Log Likelihood: 4.3795 Sigma2 Prior: -720.3147 Regularization: 0.0007
Iter: 10940 Training Loss: -722.5620
Negative Log Likelihood: 4.3181 Sigma2 Prior: -726.8807 Regularization: 0.0007
Iter: 10950 Training Loss: -724.0541
Negative Log Likelihood: 3.3371 Sigma2 Prior: -727.3918 Regularization: 0.0007
Iter: 10960 Training Loss: -722.4634
Negative Log Likelihood: 3.6161 Sigma2 Prior: -726.0803 Regularization: 0.0007
Iter: 10970 Training Loss: -717.2453
Negative Log Likelihood: 3.8965 Sigma2 Prior: -721.1425 Regularization: 0.0007
Iter: 10980 Training Loss: -730.4927
Negative Log Likelihood: 3.5617 Sigma2 Prior: -734.0552 Regularization: 0.0007
Iter: 10990 Training Loss: -722.1556
Negative Log Likelihood: 3.6833 Sigma2 Prior: -725.8397 Regularization: 0.0007
Iter: 11000 Training Loss: -723.1324
Negative Log Likelihood: 2.9499 Sigma2 Prior: -726.0831 Regularization: 0.0007
Iter: 11010 Training Loss: -718.8967
Negative Log Likelihood: 3.2594 Sigma2 Prior: -722.1568 Regularization: 0.0007
Iter: 11020 Training Loss: -734.0650
Negative Log Likelihood: 2.6198 Sigma2 Prior: -736.6855 Regularization: 0.0007
Iter: 11030 Training Loss: -726.0809
Negative Log Likelihood: 3.3188 Sigma2 Prior: -729.4005 Regularization: 0.0007
Iter: 11040 Training Loss: -724.6642
Negative Log Likelihood: 3.2075 Sigma2 Prior: -727.8724 Regularization: 0.0007
Iter: 11050 Training Loss: -716.8912
Negative Log Likelihood: 4.3148 Sigma2 Prior: -721.2068 Regularization: 0.0007
Iter: 11060 Training Loss: -721.9763
Negative Log Likelihood: 3.3708 Sigma2 Prior: -725.3478 Regularization: 0.0007
Iter: 11070 Training Loss: -723.6354
Negative Log Likelihood: 3.5383 Sigma2 Prior: -727.1744 Regularization: 0.0007
Iter: 11080 Training Loss: -721.9682
Negative Log Likelihood: 3.4639 Sigma2 Prior: -725.4328 Regularization: 0.0007
Iter: 11090 Training Loss: -732.9855
Negative Log Likelihood: 2.8516 Sigma2 Prior: -735.8378 Regularization: 0.0007
Iter: 11100 Training Loss: -742.4525
Negative Log Likelihood: 2.8928 Sigma2 Prior: -745.3460 Regularization: 0.0007
Iter: 11110 Training Loss: -728.5088
Negative Log Likelihood: 3.1514 Sigma2 Prior: -731.6609 Regularization: 0.0007
Iter: 11120 Training Loss: -718.7303
Negative Log Likelihood: 3.6661 Sigma2 Prior: -722.3972 Regularization: 0.0007
Iter: 11130 Training Loss: -725.0136
Negative Log Likelihood: 3.6679 Sigma2 Prior: -728.6823 Regularization: 0.0007
Iter: 11140 Training Loss: -734.4451
Negative Log Likelihood: 4.1041 Sigma2 Prior: -738.5499 Regularization: 0.0007
Iter: 11150 Training Loss: -719.8755
Negative Log Likelihood: 3.9726 Sigma2 Prior: -723.8488 Regularization: 0.0007
Iter: 11160 Training Loss: -727.9293
Negative Log Likelihood: 3.2797 Sigma2 Prior: -731.2098 Regularization: 0.0007
Iter: 11170 Training Loss: -740.4604
Negative Log Likelihood: 2.7500 Sigma2 Prior: -743.2111 Regularization: 0.0007
Iter: 11180 Training Loss: -720.7499
Negative Log Likelihood: 3.5232 Sigma2 Prior: -724.2739 Regularization: 0.0007
Iter: 11190 Training Loss: -723.5646
Negative Log Likelihood: 3.6493 Sigma2 Prior: -727.2146 Regularization: 0.0007
Iter: 11200 Training Loss: -720.8013
Negative Log Likelihood: 4.0588 Sigma2 Prior: -724.8608 Regularization: 0.0007
Iter: 11210 Training Loss: -720.9618
Negative Log Likelihood: 3.1541 Sigma2 Prior: -724.1166 Regularization: 0.0007
Iter: 11220 Training Loss: -717.6655
Negative Log Likelihood: 3.7850 Sigma2 Prior: -721.4512 Regularization: 0.0007
Iter: 11230 Training Loss: -728.1641
Negative Log Likelihood: 3.4649 Sigma2 Prior: -731.6296 Regularization: 0.0007
Iter: 11240 Training Loss: -723.8792
Negative Log Likelihood: 3.8050 Sigma2 Prior: -727.6849 Regularization: 0.0007
Iter: 11250 Training Loss: -732.8906
Negative Log Likelihood: 3.9802 Sigma2 Prior: -736.8716 Regularization: 0.0007
Iter: 11260 Training Loss: -726.4902
Negative Log Likelihood: 3.9571 Sigma2 Prior: -730.4480 Regularization: 0.0007
Iter: 11270 Training Loss: -732.6851
Negative Log Likelihood: 2.6780 Sigma2 Prior: -735.3639 Regularization: 0.0007
Iter: 11280 Training Loss: -720.5417
Negative Log Likelihood: 3.7435 Sigma2 Prior: -724.2859 Regularization: 0.0007
Iter: 11290 Training Loss: -715.5421
Negative Log Likelihood: 4.4159 Sigma2 Prior: -719.9587 Regularization: 0.0007
Iter: 11300 Training Loss: -736.3328
Negative Log Likelihood: 3.4913 Sigma2 Prior: -739.8248 Regularization: 0.0007
Iter: 11310 Training Loss: -727.3286
Negative Log Likelihood: 3.7634 Sigma2 Prior: -731.0927 Regularization: 0.0007
Iter: 11320 Training Loss: -730.8631
Negative Log Likelihood: 3.1519 Sigma2 Prior: -734.0157 Regularization: 0.0007
Iter: 11330 Training Loss: -725.2914
Negative Log Likelihood: 3.2985 Sigma2 Prior: -728.5907 Regularization: 0.0007
Iter: 11340 Training Loss: -723.7383
Negative Log Likelihood: 3.4380 Sigma2 Prior: -727.1770 Regularization: 0.0007
Iter: 11350 Training Loss: -720.3536
Negative Log Likelihood: 3.9298 Sigma2 Prior: -724.2842 Regularization: 0.0007
Iter: 11360 Training Loss: -726.1701
Negative Log Likelihood: 3.4922 Sigma2 Prior: -729.6631 Regularization: 0.0007
Iter: 11370 Training Loss: -730.1727
Negative Log Likelihood: 3.4785 Sigma2 Prior: -733.6519 Regularization: 0.0007
Iter: 11380 Training Loss: -729.1838
Negative Log Likelihood: 4.5844 Sigma2 Prior: -733.7690 Regularization: 0.0007
Iter: 11390 Training Loss: -721.2579
Negative Log Likelihood: 4.0094 Sigma2 Prior: -725.2679 Regularization: 0.0007
Iter: 11400 Training Loss: -717.4869
Negative Log Likelihood: 3.7121 Sigma2 Prior: -721.1997 Regularization: 0.0007
Iter: 11410 Training Loss: -727.3435
Negative Log Likelihood: 3.6548 Sigma2 Prior: -730.9990 Regularization: 0.0007
Iter: 11420 Training Loss: -726.1559
Negative Log Likelihood: 3.5055 Sigma2 Prior: -729.6622 Regularization: 0.0007
Iter: 11430 Training Loss: -724.0927
Negative Log Likelihood: 3.6646 Sigma2 Prior: -727.7580 Regularization: 0.0007
Iter: 11440 Training Loss: -719.1567
Negative Log Likelihood: 3.5528 Sigma2 Prior: -722.7102 Regularization: 0.0007
Iter: 11450 Training Loss: -722.9302
Negative Log Likelihood: 3.3606 Sigma2 Prior: -726.2916 Regularization: 0.0007
Iter: 11460 Training Loss: -728.7939
Negative Log Likelihood: 3.5377 Sigma2 Prior: -732.3323 Regularization: 0.0007
Iter: 11470 Training Loss: -721.2991
Negative Log Likelihood: 4.1073 Sigma2 Prior: -725.4072 Regularization: 0.0007
Iter: 11480 Training Loss: -721.3260
Negative Log Likelihood: 4.1827 Sigma2 Prior: -725.5094 Regularization: 0.0007
Iter: 11490 Training Loss: -730.8026
Negative Log Likelihood: 2.9667 Sigma2 Prior: -733.7700 Regularization: 0.0007
Iter: 11500 Training Loss: -726.8010
Negative Log Likelihood: 3.5532 Sigma2 Prior: -730.3550 Regularization: 0.0007
Iter: 11510 Training Loss: -717.4944
Negative Log Likelihood: 3.4660 Sigma2 Prior: -720.9611 Regularization: 0.0007
Iter: 11520 Training Loss: -723.2391
Negative Log Likelihood: 3.5627 Sigma2 Prior: -726.8025 Regularization: 0.0007
Iter: 11530 Training Loss: -714.2032
Negative Log Likelihood: 3.9084 Sigma2 Prior: -718.1124 Regularization: 0.0007
Iter: 11540 Training Loss: -727.6398
Negative Log Likelihood: 3.7093 Sigma2 Prior: -731.3497 Regularization: 0.0007
Iter: 11550 Training Loss: -725.4863
Negative Log Likelihood: 2.9455 Sigma2 Prior: -728.4324 Regularization: 0.0007
Iter: 11560 Training Loss: -733.2775
Negative Log Likelihood: 2.5513 Sigma2 Prior: -735.8295 Regularization: 0.0007
Iter: 11570 Training Loss: -733.9064
Negative Log Likelihood: 3.8174 Sigma2 Prior: -737.7245 Regularization: 0.0007
Iter: 11580 Training Loss: -727.9059
Negative Log Likelihood: 3.5066 Sigma2 Prior: -731.4133 Regularization: 0.0007
Iter: 11590 Training Loss: -721.2504
Negative Log Likelihood: 3.9788 Sigma2 Prior: -725.2299 Regularization: 0.0007
Iter: 11600 Training Loss: -723.7093
Negative Log Likelihood: 3.5035 Sigma2 Prior: -727.2136 Regularization: 0.0007
Iter: 11610 Training Loss: -727.1531
Negative Log Likelihood: 3.0761 Sigma2 Prior: -730.2299 Regularization: 0.0007
Iter: 11620 Training Loss: -722.4530
Negative Log Likelihood: 3.8612 Sigma2 Prior: -726.3149 Regularization: 0.0007
Iter: 11630 Training Loss: -725.2933
Negative Log Likelihood: 3.0143 Sigma2 Prior: -728.3083 Regularization: 0.0007
Iter: 11640 Training Loss: -712.4042
Negative Log Likelihood: 3.4329 Sigma2 Prior: -715.8379 Regularization: 0.0007
Iter: 11650 Training Loss: -732.0275
Negative Log Likelihood: 2.8922 Sigma2 Prior: -734.9204 Regularization: 0.0007
Iter: 11660 Training Loss: -720.9942
Negative Log Likelihood: 3.3703 Sigma2 Prior: -724.3652 Regularization: 0.0007
Iter: 11670 Training Loss: -720.1180
Negative Log Likelihood: 3.5587 Sigma2 Prior: -723.6774 Regularization: 0.0007
Iter: 11680 Training Loss: -721.2602
Negative Log Likelihood: 3.8425 Sigma2 Prior: -725.1035 Regularization: 0.0007
Iter: 11690 Training Loss: -723.1206
Negative Log Likelihood: 4.0635 Sigma2 Prior: -727.1848 Regularization: 0.0007
Iter: 11700 Training Loss: -724.3268
Negative Log Likelihood: 3.0069 Sigma2 Prior: -727.3345 Regularization: 0.0007
Iter: 11710 Training Loss: -730.3642
Negative Log Likelihood: 3.3856 Sigma2 Prior: -733.7505 Regularization: 0.0007
Iter: 11720 Training Loss: -735.2692
Negative Log Likelihood: 2.6144 Sigma2 Prior: -737.8843 Regularization: 0.0007
Iter: 11730 Training Loss: -720.1454
Negative Log Likelihood: 4.3025 Sigma2 Prior: -724.4487 Regularization: 0.0007
Iter: 11740 Training Loss: -722.0204
Negative Log Likelihood: 3.4903 Sigma2 Prior: -725.5115 Regularization: 0.0007
Iter: 11750 Training Loss: -727.7188
Negative Log Likelihood: 3.0708 Sigma2 Prior: -730.7903 Regularization: 0.0007
Iter: 11760 Training Loss: -730.1622
Negative Log Likelihood: 3.1173 Sigma2 Prior: -733.2803 Regularization: 0.0007
Iter: 11770 Training Loss: -733.8693
Negative Log Likelihood: 3.1131 Sigma2 Prior: -736.9832 Regularization: 0.0007
Iter: 11780 Training Loss: -736.2175
Negative Log Likelihood: 2.9330 Sigma2 Prior: -739.1512 Regularization: 0.0007
Iter: 11790 Training Loss: -718.4807
Negative Log Likelihood: 4.0622 Sigma2 Prior: -722.5437 Regularization: 0.0007
Iter: 11800 Training Loss: -728.6300
Negative Log Likelihood: 3.2852 Sigma2 Prior: -731.9160 Regularization: 0.0007
Iter: 11810 Training Loss: -731.3795
Negative Log Likelihood: 3.7941 Sigma2 Prior: -735.1743 Regularization: 0.0007
Iter: 11820 Training Loss: -721.7729
Negative Log Likelihood: 3.9032 Sigma2 Prior: -725.6769 Regularization: 0.0007
Iter: 11830 Training Loss: -723.5189
Negative Log Likelihood: 4.0169 Sigma2 Prior: -727.5365 Regularization: 0.0007
Iter: 11840 Training Loss: -719.8615
Negative Log Likelihood: 3.8321 Sigma2 Prior: -723.6944 Regularization: 0.0007
Iter: 11850 Training Loss: -727.3082
Negative Log Likelihood: 3.4311 Sigma2 Prior: -730.7401 Regularization: 0.0007
Iter: 11860 Training Loss: -725.0432
Negative Log Likelihood: 3.7604 Sigma2 Prior: -728.8044 Regularization: 0.0007
Iter: 11870 Training Loss: -720.1313
Negative Log Likelihood: 3.2442 Sigma2 Prior: -723.3763 Regularization: 0.0007
Iter: 11880 Training Loss: -730.4922
Negative Log Likelihood: 3.0197 Sigma2 Prior: -733.5128 Regularization: 0.0007
Iter: 11890 Training Loss: -735.7405
Negative Log Likelihood: 3.6269 Sigma2 Prior: -739.3681 Regularization: 0.0007
Iter: 11900 Training Loss: -717.7742
Negative Log Likelihood: 3.5529 Sigma2 Prior: -721.3278 Regularization: 0.0007
Iter: 11910 Training Loss: -715.2881
Negative Log Likelihood: 3.2206 Sigma2 Prior: -718.5094 Regularization: 0.0007
Iter: 11920 Training Loss: -722.4545
Negative Log Likelihood: 4.0696 Sigma2 Prior: -726.5248 Regularization: 0.0007
Iter: 11930 Training Loss: -717.4108
Negative Log Likelihood: 3.6488 Sigma2 Prior: -721.0604 Regularization: 0.0007
Iter: 11940 Training Loss: -716.4324
Negative Log Likelihood: 3.9100 Sigma2 Prior: -720.3431 Regularization: 0.0007
Iter: 11950 Training Loss: -722.6810
Negative Log Likelihood: 3.0769 Sigma2 Prior: -725.7587 Regularization: 0.0007
Iter: 11960 Training Loss: -716.4747
Negative Log Likelihood: 3.5706 Sigma2 Prior: -720.0460 Regularization: 0.0007
Iter: 11970 Training Loss: -733.0828
Negative Log Likelihood: 3.0311 Sigma2 Prior: -736.1146 Regularization: 0.0007
Iter: 11980 Training Loss: -727.2045
Negative Log Likelihood: 2.6730 Sigma2 Prior: -729.8782 Regularization: 0.0007
Iter: 11990 Training Loss: -727.2311
Negative Log Likelihood: 3.2397 Sigma2 Prior: -730.4715 Regularization: 0.0007
Iter: 12000 Training Loss: -731.8968
Negative Log Likelihood: 3.3090 Sigma2 Prior: -735.2065 Regularization: 0.0007
Iter: 12010 Training Loss: -723.5838
Negative Log Likelihood: 3.3748 Sigma2 Prior: -726.9594 Regularization: 0.0007
Iter: 12020 Training Loss: -716.3494
Negative Log Likelihood: 3.7493 Sigma2 Prior: -720.0994 Regularization: 0.0007
Iter: 12030 Training Loss: -725.0980
Negative Log Likelihood: 3.3433 Sigma2 Prior: -728.4421 Regularization: 0.0007
Iter: 12040 Training Loss: -722.1426
Negative Log Likelihood: 3.9997 Sigma2 Prior: -726.1431 Regularization: 0.0007
Iter: 12050 Training Loss: -716.7245
Negative Log Likelihood: 4.1031 Sigma2 Prior: -720.8282 Regularization: 0.0007
Iter: 12060 Training Loss: -720.8275
Negative Log Likelihood: 4.3583 Sigma2 Prior: -725.1865 Regularization: 0.0007
Iter: 12070 Training Loss: -723.0863
Negative Log Likelihood: 3.8827 Sigma2 Prior: -726.9698 Regularization: 0.0007
Iter: 12080 Training Loss: -722.3713
Negative Log Likelihood: 3.3852 Sigma2 Prior: -725.7572 Regularization: 0.0007
Iter: 12090 Training Loss: -728.3086
Negative Log Likelihood: 3.5316 Sigma2 Prior: -731.8409 Regularization: 0.0007
Iter: 12100 Training Loss: -732.3024
Negative Log Likelihood: 3.8339 Sigma2 Prior: -736.1370 Regularization: 0.0007
Iter: 12110 Training Loss: -723.8431
Negative Log Likelihood: 3.7279 Sigma2 Prior: -727.5717 Regularization: 0.0007
Iter: 12120 Training Loss: -728.1387
Negative Log Likelihood: 3.3599 Sigma2 Prior: -731.4994 Regularization: 0.0007
Iter: 12130 Training Loss: -724.9336
Negative Log Likelihood: 4.3751 Sigma2 Prior: -729.3094 Regularization: 0.0007
Iter: 12140 Training Loss: -728.0689
Negative Log Likelihood: 3.6630 Sigma2 Prior: -731.7326 Regularization: 0.0007
Iter: 12150 Training Loss: -717.8053
Negative Log Likelihood: 3.6882 Sigma2 Prior: -721.4943 Regularization: 0.0007
Iter: 12160 Training Loss: -724.9531
Negative Log Likelihood: 3.8525 Sigma2 Prior: -728.8063 Regularization: 0.0007
Iter: 12170 Training Loss: -722.9216
Negative Log Likelihood: 3.3148 Sigma2 Prior: -726.2371 Regularization: 0.0007
Iter: 12180 Training Loss: -721.5402
Negative Log Likelihood: 3.4830 Sigma2 Prior: -725.0239 Regularization: 0.0007
Iter: 12190 Training Loss: -734.5779
Negative Log Likelihood: 3.6010 Sigma2 Prior: -738.1796 Regularization: 0.0008
Iter: 12200 Training Loss: -731.0343
Negative Log Likelihood: 3.5825 Sigma2 Prior: -734.6176 Regularization: 0.0008
Iter: 12210 Training Loss: -721.4479
Negative Log Likelihood: 3.7385 Sigma2 Prior: -725.1872 Regularization: 0.0008
Iter: 12220 Training Loss: -717.9145
Negative Log Likelihood: 4.5830 Sigma2 Prior: -722.4982 Regularization: 0.0008
Iter: 12230 Training Loss: -726.1851
Negative Log Likelihood: 2.9693 Sigma2 Prior: -729.1551 Regularization: 0.0008
Iter: 12240 Training Loss: -732.2782
Negative Log Likelihood: 2.8934 Sigma2 Prior: -735.1723 Regularization: 0.0008
Iter: 12250 Training Loss: -723.0125
Negative Log Likelihood: 2.9460 Sigma2 Prior: -725.9592 Regularization: 0.0008
Iter: 12260 Training Loss: -731.1917
Negative Log Likelihood: 3.2202 Sigma2 Prior: -734.4126 Regularization: 0.0008
Iter: 12270 Training Loss: -728.3868
Negative Log Likelihood: 4.1979 Sigma2 Prior: -732.5854 Regularization: 0.0008
Iter: 12280 Training Loss: -725.1721
Negative Log Likelihood: 3.6233 Sigma2 Prior: -728.7962 Regularization: 0.0008
Iter: 12290 Training Loss: -715.6885
Negative Log Likelihood: 3.7245 Sigma2 Prior: -719.4137 Regularization: 0.0008
Iter: 12300 Training Loss: -725.4508
Negative Log Likelihood: 2.9653 Sigma2 Prior: -728.4168 Regularization: 0.0008
Iter: 12310 Training Loss: -725.0273
Negative Log Likelihood: 3.4933 Sigma2 Prior: -728.5213 Regularization: 0.0008
Iter: 12320 Training Loss: -719.2714
Negative Log Likelihood: 4.4980 Sigma2 Prior: -723.7701 Regularization: 0.0008
Iter: 12330 Training Loss: -717.1721
Negative Log Likelihood: 3.1587 Sigma2 Prior: -720.3315 Regularization: 0.0008
Iter: 12340 Training Loss: -724.0481
Negative Log Likelihood: 3.6886 Sigma2 Prior: -727.7374 Regularization: 0.0008
Iter: 12350 Training Loss: -727.9741
Negative Log Likelihood: 3.6770 Sigma2 Prior: -731.6517 Regularization: 0.0008
Iter: 12360 Training Loss: -729.4282
Negative Log Likelihood: 3.6357 Sigma2 Prior: -733.0646 Regularization: 0.0008
Iter: 12370 Training Loss: -727.9307
Negative Log Likelihood: 2.9680 Sigma2 Prior: -730.8994 Regularization: 0.0008
Iter: 12380 Training Loss: -724.0037
Negative Log Likelihood: 3.0652 Sigma2 Prior: -727.0696 Regularization: 0.0008
Iter: 12390 Training Loss: -727.3815
Negative Log Likelihood: 3.7717 Sigma2 Prior: -731.1539 Regularization: 0.0008
Iter: 12400 Training Loss: -725.4344
Negative Log Likelihood: 3.5861 Sigma2 Prior: -729.0212 Regularization: 0.0008
Iter: 12410 Training Loss: -725.8948
Negative Log Likelihood: 3.5845 Sigma2 Prior: -729.4800 Regularization: 0.0008
Iter: 12420 Training Loss: -722.3905
Negative Log Likelihood: 3.9889 Sigma2 Prior: -726.3802 Regularization: 0.0008
Iter: 12430 Training Loss: -719.6921
Negative Log Likelihood: 3.7644 Sigma2 Prior: -723.4573 Regularization: 0.0008
Iter: 12440 Training Loss: -723.5373
Negative Log Likelihood: 3.7319 Sigma2 Prior: -727.2699 Regularization: 0.0008
Iter: 12450 Training Loss: -731.0194
Negative Log Likelihood: 3.1839 Sigma2 Prior: -734.2040 Regularization: 0.0008
Iter: 12460 Training Loss: -725.4217
Negative Log Likelihood: 3.7075 Sigma2 Prior: -729.1299 Regularization: 0.0008
Iter: 12470 Training Loss: -727.9122
Negative Log Likelihood: 3.3797 Sigma2 Prior: -731.2926 Regularization: 0.0008
Iter: 12480 Training Loss: -724.4738
Negative Log Likelihood: 3.9634 Sigma2 Prior: -728.4379 Regularization: 0.0008
Iter: 12490 Training Loss: -726.7108
Negative Log Likelihood: 3.5377 Sigma2 Prior: -730.2493 Regularization: 0.0008
Iter: 12500 Training Loss: -721.9248
Negative Log Likelihood: 3.6706 Sigma2 Prior: -725.5961 Regularization: 0.0008
Iter: 12510 Training Loss: -728.6495
Negative Log Likelihood: 3.7813 Sigma2 Prior: -732.4316 Regularization: 0.0008
Iter: 12520 Training Loss: -732.4242
Negative Log Likelihood: 3.2376 Sigma2 Prior: -735.6625 Regularization: 0.0008
Iter: 12530 Training Loss: -717.7624
Negative Log Likelihood: 3.6429 Sigma2 Prior: -721.4060 Regularization: 0.0008
Iter: 12540 Training Loss: -728.2366
Negative Log Likelihood: 3.2929 Sigma2 Prior: -731.5302 Regularization: 0.0008
Iter: 12550 Training Loss: -722.7364
Negative Log Likelihood: 4.0472 Sigma2 Prior: -726.7844 Regularization: 0.0008
Iter: 12560 Training Loss: -730.4366
Negative Log Likelihood: 3.3044 Sigma2 Prior: -733.7417 Regularization: 0.0008
Iter: 12570 Training Loss: -731.6115
Negative Log Likelihood: 3.5226 Sigma2 Prior: -735.1348 Regularization: 0.0008
Iter: 12580 Training Loss: -728.0327
Negative Log Likelihood: 3.0025 Sigma2 Prior: -731.0360 Regularization: 0.0008
Iter: 12590 Training Loss: -714.4795
Negative Log Likelihood: 3.9632 Sigma2 Prior: -718.4435 Regularization: 0.0008
Iter: 12600 Training Loss: -727.5946
Negative Log Likelihood: 3.8376 Sigma2 Prior: -731.4330 Regularization: 0.0008
Iter: 12610 Training Loss: -715.7381
Negative Log Likelihood: 3.4001 Sigma2 Prior: -719.1389 Regularization: 0.0008
Iter: 12620 Training Loss: -730.2787
Negative Log Likelihood: 3.4888 Sigma2 Prior: -733.7682 Regularization: 0.0008
Iter: 12630 Training Loss: -714.5306
Negative Log Likelihood: 3.5735 Sigma2 Prior: -718.1049 Regularization: 0.0008
Iter: 12640 Training Loss: -727.3030
Negative Log Likelihood: 3.3388 Sigma2 Prior: -730.6425 Regularization: 0.0008
Iter: 12650 Training Loss: -725.6312
Negative Log Likelihood: 3.5305 Sigma2 Prior: -729.1625 Regularization: 0.0008
Iter: 12660 Training Loss: -707.8747
Negative Log Likelihood: 4.3320 Sigma2 Prior: -712.2074 Regularization: 0.0008
Iter: 12670 Training Loss: -721.5729
Negative Log Likelihood: 3.6893 Sigma2 Prior: -725.2629 Regularization: 0.0008
Iter: 12680 Training Loss: -721.6279
Negative Log Likelihood: 3.5503 Sigma2 Prior: -725.1790 Regularization: 0.0008
Iter: 12690 Training Loss: -726.4568
Negative Log Likelihood: 3.7244 Sigma2 Prior: -730.1819 Regularization: 0.0008
Iter: 12700 Training Loss: -726.5385
Negative Log Likelihood: 3.6707 Sigma2 Prior: -730.2100 Regularization: 0.0008
Iter: 12710 Training Loss: -711.7520
Negative Log Likelihood: 3.5652 Sigma2 Prior: -715.3179 Regularization: 0.0008
Iter: 12720 Training Loss: -728.9839
Negative Log Likelihood: 3.1475 Sigma2 Prior: -732.1322 Regularization: 0.0008
Iter: 12730 Training Loss: -722.4503
Negative Log Likelihood: 4.0149 Sigma2 Prior: -726.4659 Regularization: 0.0008
Iter: 12740 Training Loss: -723.7309
Negative Log Likelihood: 3.7475 Sigma2 Prior: -727.4791 Regularization: 0.0008
Iter: 12750 Training Loss: -717.5538
Negative Log Likelihood: 4.1297 Sigma2 Prior: -721.6843 Regularization: 0.0008
Iter: 12760 Training Loss: -730.6862
Negative Log Likelihood: 3.1021 Sigma2 Prior: -733.7890 Regularization: 0.0008
Iter: 12770 Training Loss: -732.0621
Negative Log Likelihood: 2.8329 Sigma2 Prior: -734.8958 Regularization: 0.0008
Iter: 12780 Training Loss: -723.4305
Negative Log Likelihood: 2.9763 Sigma2 Prior: -726.4076 Regularization: 0.0008
Iter: 12790 Training Loss: -730.6548
Negative Log Likelihood: 3.6304 Sigma2 Prior: -734.2860 Regularization: 0.0008
Iter: 12800 Training Loss: -726.2469
Negative Log Likelihood: 3.7657 Sigma2 Prior: -730.0134 Regularization: 0.0008
Iter: 12810 Training Loss: -728.6827
Negative Log Likelihood: 3.0982 Sigma2 Prior: -731.7816 Regularization: 0.0008
Iter: 12820 Training Loss: -727.0203
Negative Log Likelihood: 2.8094 Sigma2 Prior: -729.8304 Regularization: 0.0008
Iter: 12830 Training Loss: -724.0698
Negative Log Likelihood: 3.6184 Sigma2 Prior: -727.6890 Regularization: 0.0008
Iter: 12840 Training Loss: -714.7251
Negative Log Likelihood: 4.1156 Sigma2 Prior: -718.8415 Regularization: 0.0008
Iter: 12850 Training Loss: -727.4564
Negative Log Likelihood: 3.5461 Sigma2 Prior: -731.0033 Regularization: 0.0008
Iter: 12860 Training Loss: -715.9147
Negative Log Likelihood: 4.1665 Sigma2 Prior: -720.0820 Regularization: 0.0008
Iter: 12870 Training Loss: -720.2876
Negative Log Likelihood: 4.5619 Sigma2 Prior: -724.8503 Regularization: 0.0008
Iter: 12880 Training Loss: -727.4487
Negative Log Likelihood: 3.5620 Sigma2 Prior: -731.0115 Regularization: 0.0008
Iter: 12890 Training Loss: -727.3910
Negative Log Likelihood: 3.6716 Sigma2 Prior: -731.0634 Regularization: 0.0008
Iter: 12900 Training Loss: -725.3500
Negative Log Likelihood: 3.7389 Sigma2 Prior: -729.0897 Regularization: 0.0008
Iter: 12910 Training Loss: -713.1401
Negative Log Likelihood: 3.8660 Sigma2 Prior: -717.0070 Regularization: 0.0008
Iter: 12920 Training Loss: -728.9893
Negative Log Likelihood: 3.0363 Sigma2 Prior: -732.0264 Regularization: 0.0008
Iter: 12930 Training Loss: -719.3953
Negative Log Likelihood: 4.2140 Sigma2 Prior: -723.6100 Regularization: 0.0008
Iter: 12940 Training Loss: -721.6604
Negative Log Likelihood: 4.4205 Sigma2 Prior: -726.0817 Regularization: 0.0008
Iter: 12950 Training Loss: -730.2007
Negative Log Likelihood: 3.3822 Sigma2 Prior: -733.5837 Regularization: 0.0008
Iter: 12960 Training Loss: -733.1722
Negative Log Likelihood: 2.9617 Sigma2 Prior: -736.1346 Regularization: 0.0008
Iter: 12970 Training Loss: -734.9459
Negative Log Likelihood: 3.2513 Sigma2 Prior: -738.1980 Regularization: 0.0008
Iter: 12980 Training Loss: -719.4847
Negative Log Likelihood: 3.5249 Sigma2 Prior: -723.0104 Regularization: 0.0008
Iter: 12990 Training Loss: -724.6984
Negative Log Likelihood: 3.2580 Sigma2 Prior: -727.9572 Regularization: 0.0008
Iter: 13000 Training Loss: -729.8185
Negative Log Likelihood: 3.4843 Sigma2 Prior: -733.3036 Regularization: 0.0008
Iter: 13010 Training Loss: -720.7446
Negative Log Likelihood: 3.5382 Sigma2 Prior: -724.2837 Regularization: 0.0008
Iter: 13020 Training Loss: -719.2170
Negative Log Likelihood: 2.8943 Sigma2 Prior: -722.1121 Regularization: 0.0008
Iter: 13030 Training Loss: -723.8671
Negative Log Likelihood: 4.2712 Sigma2 Prior: -728.1391 Regularization: 0.0008
Iter: 13040 Training Loss: -725.1005
Negative Log Likelihood: 3.7469 Sigma2 Prior: -728.8481 Regularization: 0.0008
Iter: 13050 Training Loss: -727.5397
Negative Log Likelihood: 4.4162 Sigma2 Prior: -731.9567 Regularization: 0.0008
Iter: 13060 Training Loss: -722.6390
Negative Log Likelihood: 3.6719 Sigma2 Prior: -726.3117 Regularization: 0.0008
Iter: 13070 Training Loss: -724.7018
Negative Log Likelihood: 3.6041 Sigma2 Prior: -728.3068 Regularization: 0.0008
Iter: 13080 Training Loss: -720.9391
Negative Log Likelihood: 4.5460 Sigma2 Prior: -725.4859 Regularization: 0.0008
Iter: 13090 Training Loss: -720.9036
Negative Log Likelihood: 3.3625 Sigma2 Prior: -724.2669 Regularization: 0.0008
Iter: 13100 Training Loss: -720.6064
Negative Log Likelihood: 4.1714 Sigma2 Prior: -724.7786 Regularization: 0.0008
Iter: 13110 Training Loss: -716.5000
Negative Log Likelihood: 4.2592 Sigma2 Prior: -720.7599 Regularization: 0.0008
Iter: 13120 Training Loss: -725.1393
Negative Log Likelihood: 3.4441 Sigma2 Prior: -728.5842 Regularization: 0.0008
Iter: 13130 Training Loss: -724.1280
Negative Log Likelihood: 3.5345 Sigma2 Prior: -727.6633 Regularization: 0.0008
Iter: 13140 Training Loss: -726.7148
Negative Log Likelihood: 3.4540 Sigma2 Prior: -730.1696 Regularization: 0.0008
Iter: 13150 Training Loss: -728.0084
Negative Log Likelihood: 3.5563 Sigma2 Prior: -731.5655 Regularization: 0.0008
Iter: 13160 Training Loss: -718.1069
Negative Log Likelihood: 4.3297 Sigma2 Prior: -722.4374 Regularization: 0.0008
Iter: 13170 Training Loss: -720.3570
Negative Log Likelihood: 4.4194 Sigma2 Prior: -724.7772 Regularization: 0.0008
Iter: 13180 Training Loss: -727.7063
Negative Log Likelihood: 2.8123 Sigma2 Prior: -730.5194 Regularization: 0.0008
Iter: 13190 Training Loss: -724.5299
Negative Log Likelihood: 3.6565 Sigma2 Prior: -728.1872 Regularization: 0.0008
Iter: 13200 Training Loss: -728.8494
Negative Log Likelihood: 3.3088 Sigma2 Prior: -732.1591 Regularization: 0.0008
Iter: 13210 Training Loss: -721.4011
Negative Log Likelihood: 4.2755 Sigma2 Prior: -725.6774 Regularization: 0.0008
Iter: 13220 Training Loss: -733.1317
Negative Log Likelihood: 3.5799 Sigma2 Prior: -736.7123 Regularization: 0.0008
Iter: 13230 Training Loss: -728.8734
Negative Log Likelihood: 4.0809 Sigma2 Prior: -732.9550 Regularization: 0.0008
Iter: 13240 Training Loss: -722.0150
Negative Log Likelihood: 2.9256 Sigma2 Prior: -724.9414 Regularization: 0.0008
Iter: 13250 Training Loss: -728.3357
Negative Log Likelihood: 3.2906 Sigma2 Prior: -731.6271 Regularization: 0.0008
Iter: 13260 Training Loss: -730.3991
Negative Log Likelihood: 2.8897 Sigma2 Prior: -733.2896 Regularization: 0.0008
Iter: 13270 Training Loss: -717.5111
Negative Log Likelihood: 4.3230 Sigma2 Prior: -721.8348 Regularization: 0.0008
Iter: 13280 Training Loss: -729.9995
Negative Log Likelihood: 3.1304 Sigma2 Prior: -733.1307 Regularization: 0.0008
Iter: 13290 Training Loss: -729.0053
Negative Log Likelihood: 3.0920 Sigma2 Prior: -732.0981 Regularization: 0.0008
Iter: 13300 Training Loss: -720.8190
Negative Log Likelihood: 3.9410 Sigma2 Prior: -724.7608 Regularization: 0.0008
Iter: 13310 Training Loss: -711.2569
Negative Log Likelihood: 3.8132 Sigma2 Prior: -715.0709 Regularization: 0.0008
Iter: 13320 Training Loss: -725.0544
Negative Log Likelihood: 3.8070 Sigma2 Prior: -728.8622 Regularization: 0.0008
Iter: 13330 Training Loss: -716.6427
Negative Log Likelihood: 3.6619 Sigma2 Prior: -720.3054 Regularization: 0.0008
Iter: 13340 Training Loss: -712.7199
Negative Log Likelihood: 4.1405 Sigma2 Prior: -716.8612 Regularization: 0.0008
Iter: 13350 Training Loss: -727.3134
Negative Log Likelihood: 3.2376 Sigma2 Prior: -730.5519 Regularization: 0.0008
Iter: 13360 Training Loss: -730.3985
Negative Log Likelihood: 3.8844 Sigma2 Prior: -734.2837 Regularization: 0.0008
Iter: 13370 Training Loss: -723.4307
Negative Log Likelihood: 4.1813 Sigma2 Prior: -727.6129 Regularization: 0.0008
Iter: 13380 Training Loss: -721.7761
Negative Log Likelihood: 3.4020 Sigma2 Prior: -725.1788 Regularization: 0.0008
Iter: 13390 Training Loss: -731.8708
Negative Log Likelihood: 3.9191 Sigma2 Prior: -735.7907 Regularization: 0.0008
Iter: 13400 Training Loss: -718.4838
Negative Log Likelihood: 3.5437 Sigma2 Prior: -722.0283 Regularization: 0.0008
Iter: 13410 Training Loss: -723.5879
Negative Log Likelihood: 2.8777 Sigma2 Prior: -726.4664 Regularization: 0.0008
Iter: 13420 Training Loss: -725.9654
Negative Log Likelihood: 3.4041 Sigma2 Prior: -729.3703 Regularization: 0.0008
Iter: 13430 Training Loss: -722.0612
Negative Log Likelihood: 3.6151 Sigma2 Prior: -725.6771 Regularization: 0.0008
Iter: 13440 Training Loss: -725.4454
Negative Log Likelihood: 3.4286 Sigma2 Prior: -728.8748 Regularization: 0.0008
Iter: 13450 Training Loss: -732.7280
Negative Log Likelihood: 3.2970 Sigma2 Prior: -736.0258 Regularization: 0.0008
Iter: 13460 Training Loss: -731.3853
Negative Log Likelihood: 3.1817 Sigma2 Prior: -734.5677 Regularization: 0.0008
Iter: 13470 Training Loss: -725.6629
Negative Log Likelihood: 4.6409 Sigma2 Prior: -730.3046 Regularization: 0.0008
Iter: 13480 Training Loss: -729.3610
Negative Log Likelihood: 3.3477 Sigma2 Prior: -732.7095 Regularization: 0.0008
Iter: 13490 Training Loss: -727.1710
Negative Log Likelihood: 3.3950 Sigma2 Prior: -730.5668 Regularization: 0.0008
Iter: 13500 Training Loss: -729.3737
Negative Log Likelihood: 3.8823 Sigma2 Prior: -733.2568 Regularization: 0.0008
Iter: 13510 Training Loss: -727.6536
Negative Log Likelihood: 3.6322 Sigma2 Prior: -731.2866 Regularization: 0.0008
Iter: 13520 Training Loss: -718.6342
Negative Log Likelihood: 4.1883 Sigma2 Prior: -722.8232 Regularization: 0.0008
Iter: 13530 Training Loss: -723.3541
Negative Log Likelihood: 3.8583 Sigma2 Prior: -727.2132 Regularization: 0.0008
Iter: 13540 Training Loss: -724.7639
Negative Log Likelihood: 3.7299 Sigma2 Prior: -728.4946 Regularization: 0.0008
Iter: 13550 Training Loss: -725.0974
Negative Log Likelihood: 3.3150 Sigma2 Prior: -728.4132 Regularization: 0.0008
Iter: 13560 Training Loss: -714.6058
Negative Log Likelihood: 3.5230 Sigma2 Prior: -718.1296 Regularization: 0.0008
Iter: 13570 Training Loss: -724.3824
Negative Log Likelihood: 3.5797 Sigma2 Prior: -727.9629 Regularization: 0.0008
Iter: 13580 Training Loss: -717.4778
Negative Log Likelihood: 3.3818 Sigma2 Prior: -720.8605 Regularization: 0.0008
Iter: 13590 Training Loss: -727.7801
Negative Log Likelihood: 3.4399 Sigma2 Prior: -731.2208 Regularization: 0.0008
Iter: 13600 Training Loss: -725.5418
Negative Log Likelihood: 3.2643 Sigma2 Prior: -728.8069 Regularization: 0.0008
Iter: 13610 Training Loss: -730.4012
Negative Log Likelihood: 3.6673 Sigma2 Prior: -734.0693 Regularization: 0.0008
Iter: 13620 Training Loss: -729.5269
Negative Log Likelihood: 3.5235 Sigma2 Prior: -733.0513 Regularization: 0.0008
Iter: 13630 Training Loss: -736.1939
Negative Log Likelihood: 3.5657 Sigma2 Prior: -739.7604 Regularization: 0.0008
Iter: 13640 Training Loss: -722.6577
Negative Log Likelihood: 3.7485 Sigma2 Prior: -726.4069 Regularization: 0.0008
Iter: 13650 Training Loss: -732.8384
Negative Log Likelihood: 3.7178 Sigma2 Prior: -736.5570 Regularization: 0.0008
Iter: 13660 Training Loss: -729.5154
Negative Log Likelihood: 3.1618 Sigma2 Prior: -732.6780 Regularization: 0.0008
Iter: 13670 Training Loss: -728.8503
Negative Log Likelihood: 3.5207 Sigma2 Prior: -732.3718 Regularization: 0.0008
Iter: 13680 Training Loss: -724.7092
Negative Log Likelihood: 3.4491 Sigma2 Prior: -728.1591 Regularization: 0.0008
Iter: 13690 Training Loss: -713.3193
Negative Log Likelihood: 4.1902 Sigma2 Prior: -717.5103 Regularization: 0.0008
Iter: 13700 Training Loss: -720.2062
Negative Log Likelihood: 4.1034 Sigma2 Prior: -724.3105 Regularization: 0.0008
Iter: 13710 Training Loss: -724.8925
Negative Log Likelihood: 4.0388 Sigma2 Prior: -728.9321 Regularization: 0.0008
Iter: 13720 Training Loss: -728.6096
Negative Log Likelihood: 3.4826 Sigma2 Prior: -732.0930 Regularization: 0.0008
Iter: 13730 Training Loss: -723.2570
Negative Log Likelihood: 3.7150 Sigma2 Prior: -726.9728 Regularization: 0.0008
Iter: 13740 Training Loss: -725.5536
Negative Log Likelihood: 4.0558 Sigma2 Prior: -729.6103 Regularization: 0.0008
Iter: 13750 Training Loss: -727.4797
Negative Log Likelihood: 3.4873 Sigma2 Prior: -730.9678 Regularization: 0.0008
Iter: 13760 Training Loss: -724.2272
Negative Log Likelihood: 3.6703 Sigma2 Prior: -727.8983 Regularization: 0.0008
Iter: 13770 Training Loss: -727.1893
Negative Log Likelihood: 3.4834 Sigma2 Prior: -730.6735 Regularization: 0.0008
Iter: 13780 Training Loss: -714.9196
Negative Log Likelihood: 3.4399 Sigma2 Prior: -718.3602 Regularization: 0.0008
Iter: 13790 Training Loss: -728.2878
Negative Log Likelihood: 3.3928 Sigma2 Prior: -731.6814 Regularization: 0.0008
Iter: 13800 Training Loss: -723.0494
Negative Log Likelihood: 3.9224 Sigma2 Prior: -726.9727 Regularization: 0.0008
Iter: 13810 Training Loss: -732.3576
Negative Log Likelihood: 3.2929 Sigma2 Prior: -735.6513 Regularization: 0.0008
Iter: 13820 Training Loss: -723.1444
Negative Log Likelihood: 3.8525 Sigma2 Prior: -726.9977 Regularization: 0.0008
Iter: 13830 Training Loss: -728.6096
Negative Log Likelihood: 3.1276 Sigma2 Prior: -731.7380 Regularization: 0.0008
Iter: 13840 Training Loss: -718.5111
Negative Log Likelihood: 3.2789 Sigma2 Prior: -721.7908 Regularization: 0.0008
Iter: 13850 Training Loss: -733.4122
Negative Log Likelihood: 3.0138 Sigma2 Prior: -736.4268 Regularization: 0.0008
Iter: 13860 Training Loss: -731.3742
Negative Log Likelihood: 2.9252 Sigma2 Prior: -734.3002 Regularization: 0.0008
Iter: 13870 Training Loss: -726.7941
Negative Log Likelihood: 3.2984 Sigma2 Prior: -730.0933 Regularization: 0.0008
Iter: 13880 Training Loss: -725.8587
Negative Log Likelihood: 3.1994 Sigma2 Prior: -729.0590 Regularization: 0.0008
Iter: 13890 Training Loss: -719.8221
Negative Log Likelihood: 4.0403 Sigma2 Prior: -723.8632 Regularization: 0.0008
Iter: 13900 Training Loss: -727.0063
Negative Log Likelihood: 3.3290 Sigma2 Prior: -730.3361 Regularization: 0.0008
Iter: 13910 Training Loss: -727.1622
Negative Log Likelihood: 3.3356 Sigma2 Prior: -730.4986 Regularization: 0.0008
Iter: 13920 Training Loss: -737.1961
Negative Log Likelihood: 3.0123 Sigma2 Prior: -740.2092 Regularization: 0.0008
Iter: 13930 Training Loss: -734.3430
Negative Log Likelihood: 3.2148 Sigma2 Prior: -737.5585 Regularization: 0.0008
Iter: 13940 Training Loss: -727.8937
Negative Log Likelihood: 4.1705 Sigma2 Prior: -732.0650 Regularization: 0.0008
Iter: 13950 Training Loss: -726.4794
Negative Log Likelihood: 3.4525 Sigma2 Prior: -729.9327 Regularization: 0.0008
Iter: 13960 Training Loss: -716.4472
Negative Log Likelihood: 3.7432 Sigma2 Prior: -720.1912 Regularization: 0.0008
Iter: 13970 Training Loss: -730.1626
Negative Log Likelihood: 3.3663 Sigma2 Prior: -733.5297 Regularization: 0.0008
Iter: 13980 Training Loss: -721.2330
Negative Log Likelihood: 3.3801 Sigma2 Prior: -724.6139 Regularization: 0.0008
Iter: 13990 Training Loss: -732.6836
Negative Log Likelihood: 3.7389 Sigma2 Prior: -736.4232 Regularization: 0.0008
Iter: 14000 Training Loss: -717.2621
Negative Log Likelihood: 4.4063 Sigma2 Prior: -721.6693 Regularization: 0.0008
Iter: 14010 Training Loss: -726.5751
Negative Log Likelihood: 3.3372 Sigma2 Prior: -729.9131 Regularization: 0.0008
Iter: 14020 Training Loss: -725.8284
Negative Log Likelihood: 3.7214 Sigma2 Prior: -729.5507 Regularization: 0.0008
Iter: 14030 Training Loss: -719.1697
Negative Log Likelihood: 3.8629 Sigma2 Prior: -723.0333 Regularization: 0.0008
Iter: 14040 Training Loss: -723.5782
Negative Log Likelihood: 3.2601 Sigma2 Prior: -726.8391 Regularization: 0.0008
Iter: 14050 Training Loss: -717.7394
Negative Log Likelihood: 4.7055 Sigma2 Prior: -722.4457 Regularization: 0.0008
Iter: 14060 Training Loss: -732.6386
Negative Log Likelihood: 3.0503 Sigma2 Prior: -735.6896 Regularization: 0.0008
Iter: 14070 Training Loss: -721.7458
Negative Log Likelihood: 4.0096 Sigma2 Prior: -725.7562 Regularization: 0.0008
Iter: 14080 Training Loss: -717.5995
Negative Log Likelihood: 4.1298 Sigma2 Prior: -721.7301 Regularization: 0.0008
Iter: 14090 Training Loss: -714.5283
Negative Log Likelihood: 4.0712 Sigma2 Prior: -718.6003 Regularization: 0.0008
Iter: 14100 Training Loss: -730.2908
Negative Log Likelihood: 3.2590 Sigma2 Prior: -733.5507 Regularization: 0.0008
Iter: 14110 Training Loss: -720.6002
Negative Log Likelihood: 3.9314 Sigma2 Prior: -724.5323 Regularization: 0.0008
Iter: 14120 Training Loss: -736.7230
Negative Log Likelihood: 3.9705 Sigma2 Prior: -740.6942 Regularization: 0.0008
Iter: 14130 Training Loss: -718.1611
Negative Log Likelihood: 3.5868 Sigma2 Prior: -721.7487 Regularization: 0.0008
Iter: 14140 Training Loss: -725.6542
Negative Log Likelihood: 3.2142 Sigma2 Prior: -728.8692 Regularization: 0.0008
Iter: 14150 Training Loss: -717.8340
Negative Log Likelihood: 3.2974 Sigma2 Prior: -721.1321 Regularization: 0.0008
Iter: 14160 Training Loss: -713.5913
Negative Log Likelihood: 3.7608 Sigma2 Prior: -717.3529 Regularization: 0.0008
Iter: 14170 Training Loss: -718.6609
Negative Log Likelihood: 3.7997 Sigma2 Prior: -722.4614 Regularization: 0.0008
Iter: 14180 Training Loss: -722.7673
Negative Log Likelihood: 3.6235 Sigma2 Prior: -726.3917 Regularization: 0.0008
Iter: 14190 Training Loss: -729.0228
Negative Log Likelihood: 3.2833 Sigma2 Prior: -732.3069 Regularization: 0.0008
Iter: 14200 Training Loss: -730.4373
Negative Log Likelihood: 3.4011 Sigma2 Prior: -733.8392 Regularization: 0.0008
Iter: 14210 Training Loss: -729.1077
Negative Log Likelihood: 3.4885 Sigma2 Prior: -732.5969 Regularization: 0.0008
Iter: 14220 Training Loss: -713.1120
Negative Log Likelihood: 4.5156 Sigma2 Prior: -717.6284 Regularization: 0.0008
Iter: 14230 Training Loss: -722.6999
Negative Log Likelihood: 2.8907 Sigma2 Prior: -725.5914 Regularization: 0.0008
Iter: 14240 Training Loss: -725.7751
Negative Log Likelihood: 3.8852 Sigma2 Prior: -729.6611 Regularization: 0.0008
Iter: 14250 Training Loss: -730.8004
Negative Log Likelihood: 3.5785 Sigma2 Prior: -734.3797 Regularization: 0.0008
Iter: 14260 Training Loss: -728.7823
Negative Log Likelihood: 3.7911 Sigma2 Prior: -732.5742 Regularization: 0.0008
Iter: 14270 Training Loss: -725.9639
Negative Log Likelihood: 3.7105 Sigma2 Prior: -729.6752 Regularization: 0.0008
Iter: 14280 Training Loss: -724.4317
Negative Log Likelihood: 3.5184 Sigma2 Prior: -727.9509 Regularization: 0.0008
Iter: 14290 Training Loss: -720.5683
Negative Log Likelihood: 3.6344 Sigma2 Prior: -724.2035 Regularization: 0.0008
Iter: 14300 Training Loss: -712.6371
Negative Log Likelihood: 3.6190 Sigma2 Prior: -716.2568 Regularization: 0.0008
Iter: 14310 Training Loss: -724.8564
Negative Log Likelihood: 3.2441 Sigma2 Prior: -728.1013 Regularization: 0.0008
Iter: 14320 Training Loss: -723.2583
Negative Log Likelihood: 3.5267 Sigma2 Prior: -726.7858 Regularization: 0.0008
Iter: 14330 Training Loss: -727.4947
Negative Log Likelihood: 3.8211 Sigma2 Prior: -731.3166 Regularization: 0.0008
Iter: 14340 Training Loss: -726.6640
Negative Log Likelihood: 3.0741 Sigma2 Prior: -729.7390 Regularization: 0.0008
Iter: 14350 Training Loss: -726.9514
Negative Log Likelihood: 3.1412 Sigma2 Prior: -730.0934 Regularization: 0.0008
Iter: 14360 Training Loss: -729.8117
Negative Log Likelihood: 3.3723 Sigma2 Prior: -733.1848 Regularization: 0.0008
Iter: 14370 Training Loss: -730.3243
Negative Log Likelihood: 3.3657 Sigma2 Prior: -733.6908 Regularization: 0.0008
Iter: 14380 Training Loss: -727.7057
Negative Log Likelihood: 3.7205 Sigma2 Prior: -731.4270 Regularization: 0.0008
Iter: 14390 Training Loss: -727.6046
Negative Log Likelihood: 3.4038 Sigma2 Prior: -731.0092 Regularization: 0.0008
Iter: 14400 Training Loss: -725.5700
Negative Log Likelihood: 3.7285 Sigma2 Prior: -729.2993 Regularization: 0.0008
Iter: 14410 Training Loss: -730.8384
Negative Log Likelihood: 2.5281 Sigma2 Prior: -733.3674 Regularization: 0.0008
Iter: 14420 Training Loss: -720.7393
Negative Log Likelihood: 3.6891 Sigma2 Prior: -724.4292 Regularization: 0.0008
Iter: 14430 Training Loss: -724.5306
Negative Log Likelihood: 3.4465 Sigma2 Prior: -727.9779 Regularization: 0.0008
Iter: 14440 Training Loss: -735.5886
Negative Log Likelihood: 3.2223 Sigma2 Prior: -738.8116 Regularization: 0.0008
Iter: 14450 Training Loss: -726.7386
Negative Log Likelihood: 3.1021 Sigma2 Prior: -729.8415 Regularization: 0.0008
Iter: 14460 Training Loss: -714.9628
Negative Log Likelihood: 3.7026 Sigma2 Prior: -718.6662 Regularization: 0.0008
Iter: 14470 Training Loss: -717.0710
Negative Log Likelihood: 3.5259 Sigma2 Prior: -720.5977 Regularization: 0.0008
Iter: 14480 Training Loss: -735.9781
Negative Log Likelihood: 2.8408 Sigma2 Prior: -738.8198 Regularization: 0.0008
Iter: 14490 Training Loss: -723.3149
Negative Log Likelihood: 3.2093 Sigma2 Prior: -726.5251 Regularization: 0.0008
Iter: 14500 Training Loss: -724.2731
Negative Log Likelihood: 3.4823 Sigma2 Prior: -727.7562 Regularization: 0.0008
Iter: 14510 Training Loss: -725.1772
Negative Log Likelihood: 3.8091 Sigma2 Prior: -728.9871 Regularization: 0.0008
Iter: 14520 Training Loss: -735.7334
Negative Log Likelihood: 3.2410 Sigma2 Prior: -738.9752 Regularization: 0.0008
Iter: 14530 Training Loss: -720.1290
Negative Log Likelihood: 3.7406 Sigma2 Prior: -723.8704 Regularization: 0.0008
Iter: 14540 Training Loss: -730.8736
Negative Log Likelihood: 2.6160 Sigma2 Prior: -733.4904 Regularization: 0.0008
Iter: 14550 Training Loss: -730.5919
Negative Log Likelihood: 3.7283 Sigma2 Prior: -734.3210 Regularization: 0.0008
Iter: 14560 Training Loss: -733.7448
Negative Log Likelihood: 3.2702 Sigma2 Prior: -737.0157 Regularization: 0.0008
Iter: 14570 Training Loss: -730.3987
Negative Log Likelihood: 3.5189 Sigma2 Prior: -733.9185 Regularization: 0.0008
Iter: 14580 Training Loss: -726.6307
Negative Log Likelihood: 3.5566 Sigma2 Prior: -730.1881 Regularization: 0.0008
Iter: 14590 Training Loss: -732.8688
Negative Log Likelihood: 3.3725 Sigma2 Prior: -736.2421 Regularization: 0.0008
Iter: 14600 Training Loss: -724.9416
Negative Log Likelihood: 3.1550 Sigma2 Prior: -728.0974 Regularization: 0.0008
Iter: 14610 Training Loss: -720.2283
Negative Log Likelihood: 3.8919 Sigma2 Prior: -724.1210 Regularization: 0.0008
Iter: 14620 Training Loss: -728.5530
Negative Log Likelihood: 3.3108 Sigma2 Prior: -731.8646 Regularization: 0.0008
Iter: 14630 Training Loss: -730.9967
Negative Log Likelihood: 2.9166 Sigma2 Prior: -733.9141 Regularization: 0.0008
Iter: 14640 Training Loss: -734.2225
Negative Log Likelihood: 3.5963 Sigma2 Prior: -737.8196 Regularization: 0.0008
Iter: 14650 Training Loss: -728.6927
Negative Log Likelihood: 3.5367 Sigma2 Prior: -732.2302 Regularization: 0.0008
Iter: 14660 Training Loss: -721.0544
Negative Log Likelihood: 3.8051 Sigma2 Prior: -724.8604 Regularization: 0.0008
Iter: 14670 Training Loss: -717.2855
Negative Log Likelihood: 3.2716 Sigma2 Prior: -720.5579 Regularization: 0.0008
Iter: 14680 Training Loss: -726.0283
Negative Log Likelihood: 3.8012 Sigma2 Prior: -729.8303 Regularization: 0.0008
Iter: 14690 Training Loss: -732.8241
Negative Log Likelihood: 2.7914 Sigma2 Prior: -735.6163 Regularization: 0.0008
Iter: 14700 Training Loss: -722.5934
Negative Log Likelihood: 3.8699 Sigma2 Prior: -726.4641 Regularization: 0.0008
Iter: 14710 Training Loss: -729.8630
Negative Log Likelihood: 3.5858 Sigma2 Prior: -733.4496 Regularization: 0.0008
Iter: 14720 Training Loss: -717.6820
Negative Log Likelihood: 3.5461 Sigma2 Prior: -721.2289 Regularization: 0.0008
Iter: 14730 Training Loss: -719.6386
Negative Log Likelihood: 4.3973 Sigma2 Prior: -724.0367 Regularization: 0.0008
Iter: 14740 Training Loss: -725.7419
Negative Log Likelihood: 3.1637 Sigma2 Prior: -728.9064 Regularization: 0.0008
Iter: 14750 Training Loss: -720.0470
Negative Log Likelihood: 4.3057 Sigma2 Prior: -724.3535 Regularization: 0.0008
Iter: 14760 Training Loss: -723.6926
Negative Log Likelihood: 3.5210 Sigma2 Prior: -727.2144 Regularization: 0.0008
Iter: 14770 Training Loss: -726.4612
Negative Log Likelihood: 3.4252 Sigma2 Prior: -729.8871 Regularization: 0.0008
Iter: 14780 Training Loss: -720.5911
Negative Log Likelihood: 3.4470 Sigma2 Prior: -724.0388 Regularization: 0.0008
Iter: 14790 Training Loss: -723.7802
Negative Log Likelihood: 3.5508 Sigma2 Prior: -727.3318 Regularization: 0.0008
Iter: 14800 Training Loss: -729.1750
Negative Log Likelihood: 2.7871 Sigma2 Prior: -731.9629 Regularization: 0.0008
Iter: 14810 Training Loss: -727.3271
Negative Log Likelihood: 3.5358 Sigma2 Prior: -730.8637 Regularization: 0.0008
Iter: 14820 Training Loss: -736.9135
Negative Log Likelihood: 2.4996 Sigma2 Prior: -739.4138 Regularization: 0.0008
Iter: 14830 Training Loss: -727.8967
Negative Log Likelihood: 3.7229 Sigma2 Prior: -731.6204 Regularization: 0.0008
Iter: 14840 Training Loss: -734.5555
Negative Log Likelihood: 3.5068 Sigma2 Prior: -738.0630 Regularization: 0.0008
Iter: 14850 Training Loss: -725.9626
Negative Log Likelihood: 3.6863 Sigma2 Prior: -729.6497 Regularization: 0.0008
Iter: 14860 Training Loss: -734.7877
Negative Log Likelihood: 2.7711 Sigma2 Prior: -737.5596 Regularization: 0.0008
Iter: 14870 Training Loss: -718.4092
Negative Log Likelihood: 3.6698 Sigma2 Prior: -722.0798 Regularization: 0.0008
Iter: 14880 Training Loss: -720.7664
Negative Log Likelihood: 3.6802 Sigma2 Prior: -724.4473 Regularization: 0.0008
Iter: 14890 Training Loss: -721.6853
Negative Log Likelihood: 3.0887 Sigma2 Prior: -724.7748 Regularization: 0.0008
Iter: 14900 Training Loss: -722.6177
Negative Log Likelihood: 4.0256 Sigma2 Prior: -726.6440 Regularization: 0.0008
Iter: 14910 Training Loss: -714.7865
Negative Log Likelihood: 3.9971 Sigma2 Prior: -718.7844 Regularization: 0.0008
Iter: 14920 Training Loss: -731.6438
Negative Log Likelihood: 3.4559 Sigma2 Prior: -735.1005 Regularization: 0.0008
Iter: 14930 Training Loss: -723.5015
Negative Log Likelihood: 3.7003 Sigma2 Prior: -727.2026 Regularization: 0.0008
Iter: 14940 Training Loss: -730.1889
Negative Log Likelihood: 3.0270 Sigma2 Prior: -733.2167 Regularization: 0.0008
Iter: 14950 Training Loss: -731.9654
Negative Log Likelihood: 2.9490 Sigma2 Prior: -734.9152 Regularization: 0.0008
Iter: 14960 Training Loss: -717.0655
Negative Log Likelihood: 3.5944 Sigma2 Prior: -720.6606 Regularization: 0.0008
Iter: 14970 Training Loss: -718.9545
Negative Log Likelihood: 3.3225 Sigma2 Prior: -722.2778 Regularization: 0.0008
Iter: 14980 Training Loss: -716.7421
Negative Log Likelihood: 4.0725 Sigma2 Prior: -720.8154 Regularization: 0.0008
Iter: 14990 Training Loss: -731.1640
Negative Log Likelihood: 3.2270 Sigma2 Prior: -734.3917 Regularization: 0.0008
Iter: 15000 Training Loss: -735.3572
Negative Log Likelihood: 3.5040 Sigma2 Prior: -738.8619 Regularization: 0.0008
Iter: 15010 Training Loss: -727.1234
Negative Log Likelihood: 3.2996 Sigma2 Prior: -730.4237 Regularization: 0.0008
Iter: 15020 Training Loss: -718.8419
Negative Log Likelihood: 4.6175 Sigma2 Prior: -723.4602 Regularization: 0.0008
Iter: 15030 Training Loss: -722.8768
Negative Log Likelihood: 4.0708 Sigma2 Prior: -726.9484 Regularization: 0.0008
Iter: 15040 Training Loss: -725.3853
Negative Log Likelihood: 3.2381 Sigma2 Prior: -728.6241 Regularization: 0.0008
Iter: 15050 Training Loss: -733.6292
Negative Log Likelihood: 3.0404 Sigma2 Prior: -736.6703 Regularization: 0.0008
Iter: 15060 Training Loss: -735.0917
Negative Log Likelihood: 2.9468 Sigma2 Prior: -738.0393 Regularization: 0.0008
Iter: 15070 Training Loss: -714.1707
Negative Log Likelihood: 3.9244 Sigma2 Prior: -718.0959 Regularization: 0.0008
Iter: 15080 Training Loss: -718.6213
Negative Log Likelihood: 3.5886 Sigma2 Prior: -722.2108 Regularization: 0.0008
Iter: 15090 Training Loss: -726.1859
Negative Log Likelihood: 4.1543 Sigma2 Prior: -730.3409 Regularization: 0.0008
Iter: 15100 Training Loss: -713.4239
Negative Log Likelihood: 3.9871 Sigma2 Prior: -717.4118 Regularization: 0.0008
Iter: 15110 Training Loss: -732.3220
Negative Log Likelihood: 3.7230 Sigma2 Prior: -736.0458 Regularization: 0.0008
Iter: 15120 Training Loss: -715.3444
Negative Log Likelihood: 3.8124 Sigma2 Prior: -719.1576 Regularization: 0.0008
Iter: 15130 Training Loss: -717.6272
Negative Log Likelihood: 4.1589 Sigma2 Prior: -721.7869 Regularization: 0.0008
Iter: 15140 Training Loss: -717.8400
Negative Log Likelihood: 3.9506 Sigma2 Prior: -721.7914 Regularization: 0.0008
Iter: 15150 Training Loss: -722.3611
Negative Log Likelihood: 3.7668 Sigma2 Prior: -726.1287 Regularization: 0.0008
Iter: 15160 Training Loss: -719.8837
Negative Log Likelihood: 4.3742 Sigma2 Prior: -724.2587 Regularization: 0.0008
Iter: 15170 Training Loss: -719.0890
Negative Log Likelihood: 3.7783 Sigma2 Prior: -722.8680 Regularization: 0.0008
Iter: 15180 Training Loss: -735.4774
Negative Log Likelihood: 2.4943 Sigma2 Prior: -737.9725 Regularization: 0.0008
Iter: 15190 Training Loss: -723.1382
Negative Log Likelihood: 3.1074 Sigma2 Prior: -726.2465 Regularization: 0.0008
Iter: 15200 Training Loss: -718.9233
Negative Log Likelihood: 3.3560 Sigma2 Prior: -722.2801 Regularization: 0.0008
Iter: 15210 Training Loss: -724.0436
Negative Log Likelihood: 3.5389 Sigma2 Prior: -727.5833 Regularization: 0.0008
Iter: 15220 Training Loss: -721.5759
Negative Log Likelihood: 3.7748 Sigma2 Prior: -725.3515 Regularization: 0.0008
Iter: 15230 Training Loss: -728.1121
Negative Log Likelihood: 3.8501 Sigma2 Prior: -731.9630 Regularization: 0.0008
Iter: 15240 Training Loss: -733.6323
Negative Log Likelihood: 3.3587 Sigma2 Prior: -736.9917 Regularization: 0.0008
Iter: 15250 Training Loss: -722.8340
Negative Log Likelihood: 3.4661 Sigma2 Prior: -726.3009 Regularization: 0.0008
Iter: 15260 Training Loss: -719.9199
Negative Log Likelihood: 3.9514 Sigma2 Prior: -723.8720 Regularization: 0.0008
Iter: 15270 Training Loss: -720.1082
Negative Log Likelihood: 3.7509 Sigma2 Prior: -723.8599 Regularization: 0.0008
Iter: 15280 Training Loss: -730.2148
Negative Log Likelihood: 2.9906 Sigma2 Prior: -733.2062 Regularization: 0.0008
Iter: 15290 Training Loss: -730.8684
Negative Log Likelihood: 3.5483 Sigma2 Prior: -734.4175 Regularization: 0.0008
Iter: 15300 Training Loss: -725.6917
Negative Log Likelihood: 3.7778 Sigma2 Prior: -729.4703 Regularization: 0.0008
Iter: 15310 Training Loss: -724.1815
Negative Log Likelihood: 3.4968 Sigma2 Prior: -727.6790 Regularization: 0.0008
Iter: 15320 Training Loss: -721.0255
Negative Log Likelihood: 3.5760 Sigma2 Prior: -724.6024 Regularization: 0.0008
Iter: 15330 Training Loss: -722.2092
Negative Log Likelihood: 3.7090 Sigma2 Prior: -725.9190 Regularization: 0.0008
Iter: 15340 Training Loss: -722.3123
Negative Log Likelihood: 3.5159 Sigma2 Prior: -725.8290 Regularization: 0.0008
Iter: 15350 Training Loss: -724.9407
Negative Log Likelihood: 3.2391 Sigma2 Prior: -728.1806 Regularization: 0.0008
Iter: 15360 Training Loss: -724.2742
Negative Log Likelihood: 3.4747 Sigma2 Prior: -727.7497 Regularization: 0.0008
Iter: 15370 Training Loss: -723.4762
Negative Log Likelihood: 4.2779 Sigma2 Prior: -727.7549 Regularization: 0.0008
Iter: 15380 Training Loss: -719.5965
Negative Log Likelihood: 3.7773 Sigma2 Prior: -723.3746 Regularization: 0.0008
Iter: 15390 Training Loss: -720.2078
Negative Log Likelihood: 3.4989 Sigma2 Prior: -723.7075 Regularization: 0.0008
Iter: 15400 Training Loss: -730.5880
Negative Log Likelihood: 3.5900 Sigma2 Prior: -734.1788 Regularization: 0.0008
Iter: 15410 Training Loss: -719.0334
Negative Log Likelihood: 4.0037 Sigma2 Prior: -723.0379 Regularization: 0.0008
Iter: 15420 Training Loss: -727.9910
Negative Log Likelihood: 3.1189 Sigma2 Prior: -731.1107 Regularization: 0.0008
Iter: 15430 Training Loss: -716.4459
Negative Log Likelihood: 3.6054 Sigma2 Prior: -720.0521 Regularization: 0.0008
Iter: 15440 Training Loss: -731.8380
Negative Log Likelihood: 3.0709 Sigma2 Prior: -734.9097 Regularization: 0.0008
Iter: 15450 Training Loss: -713.4464
Negative Log Likelihood: 4.0903 Sigma2 Prior: -717.5375 Regularization: 0.0008
Iter: 15460 Training Loss: -725.7105
Negative Log Likelihood: 3.0005 Sigma2 Prior: -728.7119 Regularization: 0.0008
Iter: 15470 Training Loss: -724.9659
Negative Log Likelihood: 3.1378 Sigma2 Prior: -728.1046 Regularization: 0.0008
Iter: 15480 Training Loss: -719.1574
Negative Log Likelihood: 3.3848 Sigma2 Prior: -722.5431 Regularization: 0.0008
Iter: 15490 Training Loss: -717.8688
Negative Log Likelihood: 3.5424 Sigma2 Prior: -721.4120 Regularization: 0.0008
Iter: 15500 Training Loss: -728.4651
Negative Log Likelihood: 3.2021 Sigma2 Prior: -731.6681 Regularization: 0.0008
Iter: 15510 Training Loss: -723.8733
Negative Log Likelihood: 3.6072 Sigma2 Prior: -727.4813 Regularization: 0.0008
Iter: 15520 Training Loss: -724.9825
Negative Log Likelihood: 3.0637 Sigma2 Prior: -728.0471 Regularization: 0.0008
Iter: 15530 Training Loss: -721.2056
Negative Log Likelihood: 4.0614 Sigma2 Prior: -725.2678 Regularization: 0.0008
Iter: 15540 Training Loss: -723.4395
Negative Log Likelihood: 3.8171 Sigma2 Prior: -727.2574 Regularization: 0.0008
Iter: 15550 Training Loss: -721.1500
Negative Log Likelihood: 3.3806 Sigma2 Prior: -724.5315 Regularization: 0.0008
Iter: 15560 Training Loss: -713.5147
Negative Log Likelihood: 3.2945 Sigma2 Prior: -716.8101 Regularization: 0.0008
Iter: 15570 Training Loss: -727.2816
Negative Log Likelihood: 3.3960 Sigma2 Prior: -730.6784 Regularization: 0.0008
Iter: 15580 Training Loss: -723.0570
Negative Log Likelihood: 2.7610 Sigma2 Prior: -725.8188 Regularization: 0.0008
Iter: 15590 Training Loss: -719.3856
Negative Log Likelihood: 4.0602 Sigma2 Prior: -723.4466 Regularization: 0.0008
Iter: 15600 Training Loss: -725.8026
Negative Log Likelihood: 3.1622 Sigma2 Prior: -728.9656 Regularization: 0.0008
Iter: 15610 Training Loss: -721.7591
Negative Log Likelihood: 4.1448 Sigma2 Prior: -725.9048 Regularization: 0.0008
Iter: 15620 Training Loss: -727.0959
Negative Log Likelihood: 3.4657 Sigma2 Prior: -730.5624 Regularization: 0.0008
Iter: 15630 Training Loss: -728.8399
Negative Log Likelihood: 3.5027 Sigma2 Prior: -732.3434 Regularization: 0.0008
Iter: 15640 Training Loss: -728.8868
Negative Log Likelihood: 2.9418 Sigma2 Prior: -731.8295 Regularization: 0.0008
Iter: 15650 Training Loss: -735.6968
Negative Log Likelihood: 3.2581 Sigma2 Prior: -738.9558 Regularization: 0.0008
Iter: 15660 Training Loss: -726.0482
Negative Log Likelihood: 4.4374 Sigma2 Prior: -730.4865 Regularization: 0.0008
Iter: 15670 Training Loss: -717.9550
Negative Log Likelihood: 3.8205 Sigma2 Prior: -721.7764 Regularization: 0.0008
Iter: 15680 Training Loss: -725.8872
Negative Log Likelihood: 3.2461 Sigma2 Prior: -729.1341 Regularization: 0.0008
Iter: 15690 Training Loss: -719.6946
Negative Log Likelihood: 4.0897 Sigma2 Prior: -723.7852 Regularization: 0.0008
Iter: 15700 Training Loss: -704.2985
Negative Log Likelihood: 4.1518 Sigma2 Prior: -708.4512 Regularization: 0.0008
Iter: 15710 Training Loss: -720.1337
Negative Log Likelihood: 4.5612 Sigma2 Prior: -724.6957 Regularization: 0.0008
Iter: 15720 Training Loss: -722.9655
Negative Log Likelihood: 3.8416 Sigma2 Prior: -726.8079 Regularization: 0.0008
Iter: 15730 Training Loss: -725.4503
Negative Log Likelihood: 3.8382 Sigma2 Prior: -729.2892 Regularization: 0.0008
Iter: 15740 Training Loss: -724.0756
Negative Log Likelihood: 3.2745 Sigma2 Prior: -727.3510 Regularization: 0.0008
Iter: 15750 Training Loss: -727.9951
Negative Log Likelihood: 3.3095 Sigma2 Prior: -731.3055 Regularization: 0.0008
Iter: 15760 Training Loss: -725.6769
Negative Log Likelihood: 3.1900 Sigma2 Prior: -728.8678 Regularization: 0.0008
Iter: 15770 Training Loss: -723.0399
Negative Log Likelihood: 3.3517 Sigma2 Prior: -726.3924 Regularization: 0.0008
Iter: 15780 Training Loss: -739.0751
Negative Log Likelihood: 2.9640 Sigma2 Prior: -742.0399 Regularization: 0.0008
Iter: 15790 Training Loss: -726.7661
Negative Log Likelihood: 3.4331 Sigma2 Prior: -730.2001 Regularization: 0.0008
Iter: 15800 Training Loss: -727.9527
Negative Log Likelihood: 3.4059 Sigma2 Prior: -731.3594 Regularization: 0.0008
Iter: 15810 Training Loss: -737.8768
Negative Log Likelihood: 2.9245 Sigma2 Prior: -740.8021 Regularization: 0.0008
Iter: 15820 Training Loss: -710.4769
Negative Log Likelihood: 4.3279 Sigma2 Prior: -714.8057 Regularization: 0.0008
Iter: 15830 Training Loss: -725.3096
Negative Log Likelihood: 3.7511 Sigma2 Prior: -729.0616 Regularization: 0.0008
Iter: 15840 Training Loss: -727.7041
Negative Log Likelihood: 3.6697 Sigma2 Prior: -731.3747 Regularization: 0.0008
Iter: 15850 Training Loss: -719.2643
Negative Log Likelihood: 3.6455 Sigma2 Prior: -722.9106 Regularization: 0.0008
Iter: 15860 Training Loss: -725.5364
Negative Log Likelihood: 4.0331 Sigma2 Prior: -729.5704 Regularization: 0.0008
Iter: 15870 Training Loss: -737.0879
Negative Log Likelihood: 2.8516 Sigma2 Prior: -739.9404 Regularization: 0.0008
Iter: 15880 Training Loss: -731.5820
Negative Log Likelihood: 2.9842 Sigma2 Prior: -734.5670 Regularization: 0.0008
Iter: 15890 Training Loss: -724.3226
Negative Log Likelihood: 3.6459 Sigma2 Prior: -727.9693 Regularization: 0.0008
Iter: 15900 Training Loss: -721.3056
Negative Log Likelihood: 3.4182 Sigma2 Prior: -724.7247 Regularization: 0.0008
Iter: 15910 Training Loss: -732.7986
Negative Log Likelihood: 3.5569 Sigma2 Prior: -736.3564 Regularization: 0.0008
Iter: 15920 Training Loss: -715.9344
Negative Log Likelihood: 3.6305 Sigma2 Prior: -719.5658 Regularization: 0.0008
Iter: 15930 Training Loss: -723.5642
Negative Log Likelihood: 3.6396 Sigma2 Prior: -727.2047 Regularization: 0.0008
Iter: 15940 Training Loss: -715.9166
Negative Log Likelihood: 3.3591 Sigma2 Prior: -719.2766 Regularization: 0.0008
Iter: 15950 Training Loss: -729.8522
Negative Log Likelihood: 3.1699 Sigma2 Prior: -733.0230 Regularization: 0.0008
Iter: 15960 Training Loss: -717.8978
Negative Log Likelihood: 3.8798 Sigma2 Prior: -721.7784 Regularization: 0.0008
Iter: 15970 Training Loss: -732.9199
Negative Log Likelihood: 3.4991 Sigma2 Prior: -736.4198 Regularization: 0.0008
Iter: 15980 Training Loss: -730.3820
Negative Log Likelihood: 3.7165 Sigma2 Prior: -734.0994 Regularization: 0.0008
Iter: 15990 Training Loss: -719.2085
Negative Log Likelihood: 3.6975 Sigma2 Prior: -722.9069 Regularization: 0.0008
Iter: 16000 Training Loss: -715.5978
Negative Log Likelihood: 3.9648 Sigma2 Prior: -719.5635 Regularization: 0.0008
Iter: 16010 Training Loss: -720.5587
Negative Log Likelihood: 4.2113 Sigma2 Prior: -724.7709 Regularization: 0.0008
Iter: 16020 Training Loss: -726.9457
Negative Log Likelihood: 3.0224 Sigma2 Prior: -729.9690 Regularization: 0.0008
Iter: 16030 Training Loss: -732.6830
Negative Log Likelihood: 3.0839 Sigma2 Prior: -735.7678 Regularization: 0.0008
Iter: 16040 Training Loss: -715.8345
Negative Log Likelihood: 3.4627 Sigma2 Prior: -719.2981 Regularization: 0.0008
Iter: 16050 Training Loss: -723.5329
Negative Log Likelihood: 2.9436 Sigma2 Prior: -726.4773 Regularization: 0.0008
Iter: 16060 Training Loss: -720.7659
Negative Log Likelihood: 3.8147 Sigma2 Prior: -724.5814 Regularization: 0.0008
Iter: 16070 Training Loss: -717.4725
Negative Log Likelihood: 4.0949 Sigma2 Prior: -721.5682 Regularization: 0.0008
Iter: 16080 Training Loss: -734.0454
Negative Log Likelihood: 3.2828 Sigma2 Prior: -737.3290 Regularization: 0.0008
Iter: 16090 Training Loss: -724.8949
Negative Log Likelihood: 3.5400 Sigma2 Prior: -728.4357 Regularization: 0.0008
Iter: 16100 Training Loss: -728.9786
Negative Log Likelihood: 3.6754 Sigma2 Prior: -732.6548 Regularization: 0.0008
Iter: 16110 Training Loss: -729.3062
Negative Log Likelihood: 3.1877 Sigma2 Prior: -732.4948 Regularization: 0.0008
Iter: 16120 Training Loss: -736.8184
Negative Log Likelihood: 3.0957 Sigma2 Prior: -739.9149 Regularization: 0.0008
Iter: 16130 Training Loss: -737.8700
Negative Log Likelihood: 3.2148 Sigma2 Prior: -741.0856 Regularization: 0.0008
Iter: 16140 Training Loss: -733.9983
Negative Log Likelihood: 3.0606 Sigma2 Prior: -737.0598 Regularization: 0.0008
Iter: 16150 Training Loss: -724.1997
Negative Log Likelihood: 3.2509 Sigma2 Prior: -727.4515 Regularization: 0.0008
Iter: 16160 Training Loss: -719.6529
Negative Log Likelihood: 3.7983 Sigma2 Prior: -723.4520 Regularization: 0.0008
Iter: 16170 Training Loss: -722.8753
Negative Log Likelihood: 3.9351 Sigma2 Prior: -726.8113 Regularization: 0.0008
Iter: 16180 Training Loss: -721.5372
Negative Log Likelihood: 3.9026 Sigma2 Prior: -725.4407 Regularization: 0.0008
Iter: 16190 Training Loss: -720.2184
Negative Log Likelihood: 3.4420 Sigma2 Prior: -723.6613 Regularization: 0.0008
Iter: 16200 Training Loss: -733.9259
Negative Log Likelihood: 2.8866 Sigma2 Prior: -736.8134 Regularization: 0.0008
Iter: 16210 Training Loss: -725.6840
Negative Log Likelihood: 3.3137 Sigma2 Prior: -728.9985 Regularization: 0.0008
Iter: 16220 Training Loss: -727.3297
Negative Log Likelihood: 3.7123 Sigma2 Prior: -731.0428 Regularization: 0.0008
Iter: 16230 Training Loss: -718.3382
Negative Log Likelihood: 3.8636 Sigma2 Prior: -722.2026 Regularization: 0.0008
Iter: 16240 Training Loss: -717.9803
Negative Log Likelihood: 3.9748 Sigma2 Prior: -721.9559 Regularization: 0.0008
Iter: 16250 Training Loss: -727.9528
Negative Log Likelihood: 2.8365 Sigma2 Prior: -730.7902 Regularization: 0.0008
Iter: 16260 Training Loss: -717.6725
Negative Log Likelihood: 4.3797 Sigma2 Prior: -722.0531 Regularization: 0.0008
Iter: 16270 Training Loss: -719.6921
Negative Log Likelihood: 4.0921 Sigma2 Prior: -723.7850 Regularization: 0.0008
Iter: 16280 Training Loss: -728.1292
Negative Log Likelihood: 3.0615 Sigma2 Prior: -731.1915 Regularization: 0.0008
Iter: 16290 Training Loss: -718.2881
Negative Log Likelihood: 3.5481 Sigma2 Prior: -721.8370 Regularization: 0.0008
Iter: 16300 Training Loss: -732.1506
Negative Log Likelihood: 3.2795 Sigma2 Prior: -735.4310 Regularization: 0.0008
Iter: 16310 Training Loss: -726.9437
Negative Log Likelihood: 3.1762 Sigma2 Prior: -730.1208 Regularization: 0.0008
Iter: 16320 Training Loss: -717.4825
Negative Log Likelihood: 3.8384 Sigma2 Prior: -721.3218 Regularization: 0.0008
Iter: 16330 Training Loss: -734.8641
Negative Log Likelihood: 3.7775 Sigma2 Prior: -738.6425 Regularization: 0.0008
Iter: 16340 Training Loss: -720.5609
Negative Log Likelihood: 3.9581 Sigma2 Prior: -724.5198 Regularization: 0.0008
Iter: 16350 Training Loss: -728.5635
Negative Log Likelihood: 3.1495 Sigma2 Prior: -731.7138 Regularization: 0.0008
Iter: 16360 Training Loss: -719.9611
Negative Log Likelihood: 3.6279 Sigma2 Prior: -723.5898 Regularization: 0.0008
Iter: 16370 Training Loss: -731.4055
Negative Log Likelihood: 3.7271 Sigma2 Prior: -735.1334 Regularization: 0.0008
Iter: 16380 Training Loss: -730.1725
Negative Log Likelihood: 3.2840 Sigma2 Prior: -733.4574 Regularization: 0.0008
Iter: 16390 Training Loss: -736.0834
Negative Log Likelihood: 2.1566 Sigma2 Prior: -738.2408 Regularization: 0.0008
Iter: 16400 Training Loss: -723.3711
Negative Log Likelihood: 3.6774 Sigma2 Prior: -727.0493 Regularization: 0.0008
Iter: 16410 Training Loss: -725.9747
Negative Log Likelihood: 3.6716 Sigma2 Prior: -729.6472 Regularization: 0.0008
Iter: 16420 Training Loss: -719.8290
Negative Log Likelihood: 3.5327 Sigma2 Prior: -723.3625 Regularization: 0.0008
Iter: 16430 Training Loss: -718.3555
Negative Log Likelihood: 3.9174 Sigma2 Prior: -722.2738 Regularization: 0.0008
Iter: 16440 Training Loss: -730.2421
Negative Log Likelihood: 3.5088 Sigma2 Prior: -733.7517 Regularization: 0.0008
Iter: 16450 Training Loss: -726.2040
Negative Log Likelihood: 3.6267 Sigma2 Prior: -729.8315 Regularization: 0.0008
Iter: 16460 Training Loss: -727.2372
Negative Log Likelihood: 3.9575 Sigma2 Prior: -731.1956 Regularization: 0.0008
Iter: 16470 Training Loss: -710.2403
Negative Log Likelihood: 3.8457 Sigma2 Prior: -714.0869 Regularization: 0.0008
Iter: 16480 Training Loss: -722.2652
Negative Log Likelihood: 4.3262 Sigma2 Prior: -726.5923 Regularization: 0.0008
Iter: 16490 Training Loss: -736.1474
Negative Log Likelihood: 3.1934 Sigma2 Prior: -739.3417 Regularization: 0.0008
Iter: 16500 Training Loss: -721.6961
Negative Log Likelihood: 3.8968 Sigma2 Prior: -725.5938 Regularization: 0.0008
Iter: 16510 Training Loss: -719.4877
Negative Log Likelihood: 4.0092 Sigma2 Prior: -723.4977 Regularization: 0.0008
Iter: 16520 Training Loss: -725.8964
Negative Log Likelihood: 3.8672 Sigma2 Prior: -729.7645 Regularization: 0.0008
Iter: 16530 Training Loss: -730.4890
Negative Log Likelihood: 3.0529 Sigma2 Prior: -733.5427 Regularization: 0.0008
Iter: 16540 Training Loss: -729.6383
Negative Log Likelihood: 3.8254 Sigma2 Prior: -733.4645 Regularization: 0.0008
Iter: 16550 Training Loss: -723.8704
Negative Log Likelihood: 3.8946 Sigma2 Prior: -727.7658 Regularization: 0.0008
Iter: 16560 Training Loss: -729.6353
Negative Log Likelihood: 3.9643 Sigma2 Prior: -733.6004 Regularization: 0.0008
Iter: 16570 Training Loss: -717.8948
Negative Log Likelihood: 3.9148 Sigma2 Prior: -721.8105 Regularization: 0.0008
Iter: 16580 Training Loss: -730.8329
Negative Log Likelihood: 3.3329 Sigma2 Prior: -734.1666 Regularization: 0.0008
Iter: 16590 Training Loss: -718.6602
Negative Log Likelihood: 4.2642 Sigma2 Prior: -722.9252 Regularization: 0.0008
Iter: 16600 Training Loss: -730.9871
Negative Log Likelihood: 3.7777 Sigma2 Prior: -734.7656 Regularization: 0.0008
Iter: 16610 Training Loss: -718.5704
Negative Log Likelihood: 3.8351 Sigma2 Prior: -722.4064 Regularization: 0.0009
Iter: 16620 Training Loss: -727.9827
Negative Log Likelihood: 3.5226 Sigma2 Prior: -731.5061 Regularization: 0.0009
Iter: 16630 Training Loss: -720.6542
Negative Log Likelihood: 3.0532 Sigma2 Prior: -723.7083 Regularization: 0.0009
Iter: 16640 Training Loss: -727.5668
Negative Log Likelihood: 3.8831 Sigma2 Prior: -731.4507 Regularization: 0.0009
Iter: 16650 Training Loss: -721.3762
Negative Log Likelihood: 4.1892 Sigma2 Prior: -725.5662 Regularization: 0.0009
Iter: 16660 Training Loss: -726.6713
Negative Log Likelihood: 3.1243 Sigma2 Prior: -729.7964 Regularization: 0.0009
Iter: 16670 Training Loss: -715.0406
Negative Log Likelihood: 3.1743 Sigma2 Prior: -718.2158 Regularization: 0.0009
Iter: 16680 Training Loss: -724.8095
Negative Log Likelihood: 3.3329 Sigma2 Prior: -728.1432 Regularization: 0.0009
Iter: 16690 Training Loss: -728.3438
Negative Log Likelihood: 3.3255 Sigma2 Prior: -731.6702 Regularization: 0.0009
Iter: 16700 Training Loss: -732.0031
Negative Log Likelihood: 3.6855 Sigma2 Prior: -735.6894 Regularization: 0.0009
Iter: 16710 Training Loss: -734.9612
Negative Log Likelihood: 3.2042 Sigma2 Prior: -738.1662 Regularization: 0.0009
Iter: 16720 Training Loss: -728.4468
Negative Log Likelihood: 3.4720 Sigma2 Prior: -731.9197 Regularization: 0.0009
Iter: 16730 Training Loss: -725.1144
Negative Log Likelihood: 4.2151 Sigma2 Prior: -729.3304 Regularization: 0.0009
Iter: 16740 Training Loss: -725.8315
Negative Log Likelihood: 3.3394 Sigma2 Prior: -729.1718 Regularization: 0.0009
Iter: 16750 Training Loss: -725.6897
Negative Log Likelihood: 3.5148 Sigma2 Prior: -729.2053 Regularization: 0.0009
Iter: 16760 Training Loss: -725.5356
Negative Log Likelihood: 3.8705 Sigma2 Prior: -729.4070 Regularization: 0.0009
Iter: 16770 Training Loss: -729.3804
Negative Log Likelihood: 3.1704 Sigma2 Prior: -732.5516 Regularization: 0.0009
Iter: 16780 Training Loss: -724.5652
Negative Log Likelihood: 3.9787 Sigma2 Prior: -728.5448 Regularization: 0.0009
Iter: 16790 Training Loss: -719.5627
Negative Log Likelihood: 3.5111 Sigma2 Prior: -723.0747 Regularization: 0.0009
Iter: 16800 Training Loss: -724.6509
Negative Log Likelihood: 3.1788 Sigma2 Prior: -727.8305 Regularization: 0.0009
Iter: 16810 Training Loss: -725.6794
Negative Log Likelihood: 3.8595 Sigma2 Prior: -729.5397 Regularization: 0.0009
Iter: 16820 Training Loss: -723.6546
Negative Log Likelihood: 3.4747 Sigma2 Prior: -727.1301 Regularization: 0.0009
Iter: 16830 Training Loss: -724.4540
Negative Log Likelihood: 3.2829 Sigma2 Prior: -727.7378 Regularization: 0.0009
Iter: 16840 Training Loss: -715.4738
Negative Log Likelihood: 3.7961 Sigma2 Prior: -719.2707 Regularization: 0.0009
Iter: 16850 Training Loss: -727.4464
Negative Log Likelihood: 3.1061 Sigma2 Prior: -730.5533 Regularization: 0.0009
Iter: 16860 Training Loss: -729.3207
Negative Log Likelihood: 3.3379 Sigma2 Prior: -732.6594 Regularization: 0.0009
Iter: 16870 Training Loss: -727.1580
Negative Log Likelihood: 4.0085 Sigma2 Prior: -731.1673 Regularization: 0.0009
Iter: 16880 Training Loss: -725.1492
Negative Log Likelihood: 3.6238 Sigma2 Prior: -728.7738 Regularization: 0.0009
Iter: 16890 Training Loss: -723.8990
Negative Log Likelihood: 2.9874 Sigma2 Prior: -726.8873 Regularization: 0.0009
Iter: 16900 Training Loss: -716.7350
Negative Log Likelihood: 3.9571 Sigma2 Prior: -720.6929 Regularization: 0.0009
Iter: 16910 Training Loss: -722.8524
Negative Log Likelihood: 4.5152 Sigma2 Prior: -727.3684 Regularization: 0.0009
Iter: 16920 Training Loss: -728.4135
Negative Log Likelihood: 3.5873 Sigma2 Prior: -732.0016 Regularization: 0.0009
Iter: 16930 Training Loss: -728.4631
Negative Log Likelihood: 3.2696 Sigma2 Prior: -731.7336 Regularization: 0.0009
Iter: 16940 Training Loss: -715.4149
Negative Log Likelihood: 4.2550 Sigma2 Prior: -719.6708 Regularization: 0.0009
Iter: 16950 Training Loss: -722.7590
Negative Log Likelihood: 4.3936 Sigma2 Prior: -727.1534 Regularization: 0.0009
Iter: 16960 Training Loss: -731.8639
Negative Log Likelihood: 2.9442 Sigma2 Prior: -734.8090 Regularization: 0.0009
Iter: 16970 Training Loss: -719.4136
Negative Log Likelihood: 3.7028 Sigma2 Prior: -723.1173 Regularization: 0.0009
Iter: 16980 Training Loss: -729.0106
Negative Log Likelihood: 3.9501 Sigma2 Prior: -732.9615 Regularization: 0.0009
Iter: 16990 Training Loss: -735.7848
Negative Log Likelihood: 2.9939 Sigma2 Prior: -738.7795 Regularization: 0.0009
Iter: 17000 Training Loss: -728.9574
Negative Log Likelihood: 3.4463 Sigma2 Prior: -732.4045 Regularization: 0.0009
Iter: 17010 Training Loss: -716.0005
Negative Log Likelihood: 4.1503 Sigma2 Prior: -720.1517 Regularization: 0.0009
Iter: 17020 Training Loss: -719.9755
Negative Log Likelihood: 3.5661 Sigma2 Prior: -723.5425 Regularization: 0.0009
Iter: 17030 Training Loss: -727.5252
Negative Log Likelihood: 3.5514 Sigma2 Prior: -731.0775 Regularization: 0.0009
Iter: 17040 Training Loss: -725.6323
Negative Log Likelihood: 3.6470 Sigma2 Prior: -729.2802 Regularization: 0.0009
Iter: 17050 Training Loss: -729.1443
Negative Log Likelihood: 3.9137 Sigma2 Prior: -733.0589 Regularization: 0.0009
Iter: 17060 Training Loss: -710.4373
Negative Log Likelihood: 4.0538 Sigma2 Prior: -714.4919 Regularization: 0.0009
Iter: 17070 Training Loss: -727.3793
Negative Log Likelihood: 3.6164 Sigma2 Prior: -730.9965 Regularization: 0.0009
Iter: 17080 Training Loss: -718.8119
Negative Log Likelihood: 3.1142 Sigma2 Prior: -721.9270 Regularization: 0.0009
Iter: 17090 Training Loss: -716.5769
Negative Log Likelihood: 3.1303 Sigma2 Prior: -719.7081 Regularization: 0.0009
Iter: 17100 Training Loss: -718.2307
Negative Log Likelihood: 3.3223 Sigma2 Prior: -721.5539 Regularization: 0.0009
Iter: 17110 Training Loss: -735.1651
Negative Log Likelihood: 4.3085 Sigma2 Prior: -739.4745 Regularization: 0.0009
Iter: 17120 Training Loss: -720.7004
Negative Log Likelihood: 3.4594 Sigma2 Prior: -724.1606 Regularization: 0.0009
Iter: 17130 Training Loss: -722.3308
Negative Log Likelihood: 3.3392 Sigma2 Prior: -725.6708 Regularization: 0.0009
Iter: 17140 Training Loss: -726.7682
Negative Log Likelihood: 3.3477 Sigma2 Prior: -730.1168 Regularization: 0.0009
Iter: 17150 Training Loss: -725.0822
Negative Log Likelihood: 3.0613 Sigma2 Prior: -728.1444 Regularization: 0.0009
Iter: 17160 Training Loss: -720.6433
Negative Log Likelihood: 3.9608 Sigma2 Prior: -724.6050 Regularization: 0.0009
Iter: 17170 Training Loss: -727.5533
Negative Log Likelihood: 3.7685 Sigma2 Prior: -731.3226 Regularization: 0.0009
Iter: 17180 Training Loss: -724.7241
Negative Log Likelihood: 3.3937 Sigma2 Prior: -728.1187 Regularization: 0.0009
Iter: 17190 Training Loss: -727.9631
Negative Log Likelihood: 3.7155 Sigma2 Prior: -731.6794 Regularization: 0.0009
Iter: 17200 Training Loss: -727.8182
Negative Log Likelihood: 3.1174 Sigma2 Prior: -730.9364 Regularization: 0.0009
Iter: 17210 Training Loss: -729.6329
Negative Log Likelihood: 3.6426 Sigma2 Prior: -733.2764 Regularization: 0.0009
Iter: 17220 Training Loss: -728.8141
Negative Log Likelihood: 3.5852 Sigma2 Prior: -732.4001 Regularization: 0.0009
Iter: 17230 Training Loss: -720.8876
Negative Log Likelihood: 4.0506 Sigma2 Prior: -724.9391 Regularization: 0.0009
Iter: 17240 Training Loss: -715.4130
Negative Log Likelihood: 4.6062 Sigma2 Prior: -720.0200 Regularization: 0.0009
Iter: 17250 Training Loss: -737.1339
Negative Log Likelihood: 3.1388 Sigma2 Prior: -740.2735 Regularization: 0.0009
Iter: 17260 Training Loss: -719.6909
Negative Log Likelihood: 3.6789 Sigma2 Prior: -723.3707 Regularization: 0.0009
Iter: 17270 Training Loss: -731.8199
Negative Log Likelihood: 2.9750 Sigma2 Prior: -734.7958 Regularization: 0.0009
Iter: 17280 Training Loss: -728.7507
Negative Log Likelihood: 3.6023 Sigma2 Prior: -732.3538 Regularization: 0.0009
Iter: 17290 Training Loss: -712.9933
Negative Log Likelihood: 3.5133 Sigma2 Prior: -716.5074 Regularization: 0.0009
Iter: 17300 Training Loss: -723.7191
Negative Log Likelihood: 3.8201 Sigma2 Prior: -727.5401 Regularization: 0.0009
Iter: 17310 Training Loss: -720.6963
Negative Log Likelihood: 3.9951 Sigma2 Prior: -724.6923 Regularization: 0.0009
Iter: 17320 Training Loss: -716.1122
Negative Log Likelihood: 3.8496 Sigma2 Prior: -719.9626 Regularization: 0.0009
Iter: 17330 Training Loss: -721.4225
Negative Log Likelihood: 3.5149 Sigma2 Prior: -724.9384 Regularization: 0.0009
Iter: 17340 Training Loss: -723.4604
Negative Log Likelihood: 3.7626 Sigma2 Prior: -727.2239 Regularization: 0.0009
Iter: 17350 Training Loss: -724.8270
Negative Log Likelihood: 3.4148 Sigma2 Prior: -728.2427 Regularization: 0.0009
Iter: 17360 Training Loss: -731.3526
Negative Log Likelihood: 3.0831 Sigma2 Prior: -734.4365 Regularization: 0.0009
Iter: 17370 Training Loss: -725.1655
Negative Log Likelihood: 3.2413 Sigma2 Prior: -728.4076 Regularization: 0.0009
Iter: 17380 Training Loss: -726.0651
Negative Log Likelihood: 4.0174 Sigma2 Prior: -730.0833 Regularization: 0.0009
Iter: 17390 Training Loss: -725.4409
Negative Log Likelihood: 3.5427 Sigma2 Prior: -728.9845 Regularization: 0.0009
Iter: 17400 Training Loss: -731.7408
Negative Log Likelihood: 3.2159 Sigma2 Prior: -734.9576 Regularization: 0.0009
Iter: 17410 Training Loss: -724.0778
Negative Log Likelihood: 3.4980 Sigma2 Prior: -727.5766 Regularization: 0.0009
Iter: 17420 Training Loss: -737.1243
Negative Log Likelihood: 3.5476 Sigma2 Prior: -740.6727 Regularization: 0.0009
Iter: 17430 Training Loss: -719.0543
Negative Log Likelihood: 3.3710 Sigma2 Prior: -722.4261 Regularization: 0.0009
Iter: 17440 Training Loss: -729.7393
Negative Log Likelihood: 3.0159 Sigma2 Prior: -732.7560 Regularization: 0.0009
Iter: 17450 Training Loss: -725.1411
Negative Log Likelihood: 3.5377 Sigma2 Prior: -728.6797 Regularization: 0.0009
Iter: 17460 Training Loss: -719.4324
Negative Log Likelihood: 4.1754 Sigma2 Prior: -723.6087 Regularization: 0.0009
Iter: 17470 Training Loss: -727.9687
Negative Log Likelihood: 4.0434 Sigma2 Prior: -732.0130 Regularization: 0.0009
Iter: 17480 Training Loss: -718.5901
Negative Log Likelihood: 3.5173 Sigma2 Prior: -722.1083 Regularization: 0.0009
Iter: 17490 Training Loss: -724.3060
Negative Log Likelihood: 2.9763 Sigma2 Prior: -727.2832 Regularization: 0.0009
Iter: 17500 Training Loss: -727.2440
Negative Log Likelihood: 3.6279 Sigma2 Prior: -730.8727 Regularization: 0.0009
Iter: 17510 Training Loss: -726.9304
Negative Log Likelihood: 3.3667 Sigma2 Prior: -730.2980 Regularization: 0.0009
Iter: 17520 Training Loss: -721.6332
Negative Log Likelihood: 3.9939 Sigma2 Prior: -725.6281 Regularization: 0.0009
Iter: 17530 Training Loss: -728.9670
Negative Log Likelihood: 3.6029 Sigma2 Prior: -732.5707 Regularization: 0.0009
Iter: 17540 Training Loss: -722.6687
Negative Log Likelihood: 3.3273 Sigma2 Prior: -725.9968 Regularization: 0.0009
Iter: 17550 Training Loss: -728.5245
Negative Log Likelihood: 3.5199 Sigma2 Prior: -732.0452 Regularization: 0.0009
Iter: 17560 Training Loss: -724.8978
Negative Log Likelihood: 3.5547 Sigma2 Prior: -728.4534 Regularization: 0.0009
Iter: 17570 Training Loss: -733.5684
Negative Log Likelihood: 3.4763 Sigma2 Prior: -737.0455 Regularization: 0.0009
Iter: 17580 Training Loss: -728.2650
Negative Log Likelihood: 3.7790 Sigma2 Prior: -732.0449 Regularization: 0.0009
Iter: 17590 Training Loss: -725.3926
Negative Log Likelihood: 3.7079 Sigma2 Prior: -729.1013 Regularization: 0.0009
Iter: 17600 Training Loss: -726.3294
Negative Log Likelihood: 3.6353 Sigma2 Prior: -729.9656 Regularization: 0.0009
Iter: 17610 Training Loss: -729.0604
Negative Log Likelihood: 3.4232 Sigma2 Prior: -732.4844 Regularization: 0.0009
Iter: 17620 Training Loss: -717.5251
Negative Log Likelihood: 3.9083 Sigma2 Prior: -721.4343 Regularization: 0.0009
Iter: 17630 Training Loss: -732.4001
Negative Log Likelihood: 2.2788 Sigma2 Prior: -734.6797 Regularization: 0.0009
Iter: 17640 Training Loss: -724.2581
Negative Log Likelihood: 3.7576 Sigma2 Prior: -728.0165 Regularization: 0.0009
Iter: 17650 Training Loss: -718.7062
Negative Log Likelihood: 3.6598 Sigma2 Prior: -722.3669 Regularization: 0.0009
Iter: 17660 Training Loss: -717.9631
Negative Log Likelihood: 4.4967 Sigma2 Prior: -722.4607 Regularization: 0.0009
Iter: 17670 Training Loss: -722.7197
Negative Log Likelihood: 4.0437 Sigma2 Prior: -726.7643 Regularization: 0.0009
Iter: 17680 Training Loss: -712.9075
Negative Log Likelihood: 3.8473 Sigma2 Prior: -716.7557 Regularization: 0.0009
Iter: 17690 Training Loss: -720.3958
Negative Log Likelihood: 3.2146 Sigma2 Prior: -723.6113 Regularization: 0.0009
Iter: 17700 Training Loss: -729.2010
Negative Log Likelihood: 3.1389 Sigma2 Prior: -732.3408 Regularization: 0.0009
Iter: 17710 Training Loss: -719.8483
Negative Log Likelihood: 3.7674 Sigma2 Prior: -723.6165 Regularization: 0.0009
Iter: 17720 Training Loss: -718.6035
Negative Log Likelihood: 4.0243 Sigma2 Prior: -722.6287 Regularization: 0.0009
Iter: 17730 Training Loss: -724.8187
Negative Log Likelihood: 3.2979 Sigma2 Prior: -728.1174 Regularization: 0.0009
Iter: 17740 Training Loss: -716.2338
Negative Log Likelihood: 3.4429 Sigma2 Prior: -719.6776 Regularization: 0.0009
Iter: 17750 Training Loss: -721.3574
Negative Log Likelihood: 3.1747 Sigma2 Prior: -724.5329 Regularization: 0.0009
Iter: 17760 Training Loss: -731.9494
Negative Log Likelihood: 2.7413 Sigma2 Prior: -734.6915 Regularization: 0.0009
Iter: 17770 Training Loss: -728.0966
Negative Log Likelihood: 3.3194 Sigma2 Prior: -731.4169 Regularization: 0.0009
Iter: 17780 Training Loss: -723.1055
Negative Log Likelihood: 3.3001 Sigma2 Prior: -726.4064 Regularization: 0.0009
Iter: 17790 Training Loss: -727.3509
Negative Log Likelihood: 3.7568 Sigma2 Prior: -731.1086 Regularization: 0.0009
Iter: 17800 Training Loss: -728.6586
Negative Log Likelihood: 3.2258 Sigma2 Prior: -731.8853 Regularization: 0.0009
Iter: 17810 Training Loss: -717.3907
Negative Log Likelihood: 4.2962 Sigma2 Prior: -721.6878 Regularization: 0.0009
Iter: 17820 Training Loss: -731.3647
Negative Log Likelihood: 3.4019 Sigma2 Prior: -734.7674 Regularization: 0.0009
Iter: 17830 Training Loss: -737.0963
Negative Log Likelihood: 2.9779 Sigma2 Prior: -740.0750 Regularization: 0.0009
Iter: 17840 Training Loss: -722.1484
Negative Log Likelihood: 3.5215 Sigma2 Prior: -725.6707 Regularization: 0.0009
Iter: 17850 Training Loss: -722.4923
Negative Log Likelihood: 3.6518 Sigma2 Prior: -726.1449 Regularization: 0.0009
Iter: 17860 Training Loss: -729.6787
Negative Log Likelihood: 2.8122 Sigma2 Prior: -732.4918 Regularization: 0.0009
Iter: 17870 Training Loss: -721.7058
Negative Log Likelihood: 3.3872 Sigma2 Prior: -725.0939 Regularization: 0.0009
Iter: 17880 Training Loss: -730.2492
Negative Log Likelihood: 3.3243 Sigma2 Prior: -733.5743 Regularization: 0.0009
Iter: 17890 Training Loss: -718.3075
Negative Log Likelihood: 4.3061 Sigma2 Prior: -722.6144 Regularization: 0.0009
Iter: 17900 Training Loss: -726.4326
Negative Log Likelihood: 3.4524 Sigma2 Prior: -729.8859 Regularization: 0.0009
Iter: 17910 Training Loss: -723.5603
Negative Log Likelihood: 3.7017 Sigma2 Prior: -727.2629 Regularization: 0.0009
Iter: 17920 Training Loss: -732.5324
Negative Log Likelihood: 3.4723 Sigma2 Prior: -736.0055 Regularization: 0.0009
Iter: 17930 Training Loss: -729.5322
Negative Log Likelihood: 3.0920 Sigma2 Prior: -732.6251 Regularization: 0.0009
Iter: 17940 Training Loss: -722.1837
Negative Log Likelihood: 3.2487 Sigma2 Prior: -725.4332 Regularization: 0.0009
Iter: 17950 Training Loss: -727.9163
Negative Log Likelihood: 3.9048 Sigma2 Prior: -731.8219 Regularization: 0.0009
Iter: 17960 Training Loss: -726.1028
Negative Log Likelihood: 4.1435 Sigma2 Prior: -730.2471 Regularization: 0.0009
Iter: 17970 Training Loss: -728.0087
Negative Log Likelihood: 3.9708 Sigma2 Prior: -731.9803 Regularization: 0.0009
Iter: 17980 Training Loss: -730.5536
Negative Log Likelihood: 3.0489 Sigma2 Prior: -733.6034 Regularization: 0.0009
Iter: 17990 Training Loss: -723.5605
Negative Log Likelihood: 3.0808 Sigma2 Prior: -726.6422 Regularization: 0.0009
Iter: 18000 Training Loss: -722.7422
Negative Log Likelihood: 4.0678 Sigma2 Prior: -726.8109 Regularization: 0.0009
Iter: 18010 Training Loss: -721.6591
Negative Log Likelihood: 3.3477 Sigma2 Prior: -725.0077 Regularization: 0.0009
Iter: 18020 Training Loss: -725.8024
Negative Log Likelihood: 3.3229 Sigma2 Prior: -729.1261 Regularization: 0.0009
Iter: 18030 Training Loss: -736.6490
Negative Log Likelihood: 3.2541 Sigma2 Prior: -739.9040 Regularization: 0.0009
Iter: 18040 Training Loss: -723.0681
Negative Log Likelihood: 3.4095 Sigma2 Prior: -726.4784 Regularization: 0.0009
Iter: 18050 Training Loss: -726.7964
Negative Log Likelihood: 3.1075 Sigma2 Prior: -729.9048 Regularization: 0.0009
Iter: 18060 Training Loss: -730.5295
Negative Log Likelihood: 3.1752 Sigma2 Prior: -733.7056 Regularization: 0.0009
Iter: 18070 Training Loss: -715.9991
Negative Log Likelihood: 4.0020 Sigma2 Prior: -720.0020 Regularization: 0.0009
Iter: 18080 Training Loss: -731.3111
Negative Log Likelihood: 3.3005 Sigma2 Prior: -734.6124 Regularization: 0.0009
Iter: 18090 Training Loss: -726.7186
Negative Log Likelihood: 3.0684 Sigma2 Prior: -729.7878 Regularization: 0.0009
Iter: 18100 Training Loss: -725.0937
Negative Log Likelihood: 2.6439 Sigma2 Prior: -727.7385 Regularization: 0.0009
Iter: 18110 Training Loss: -731.4318
Negative Log Likelihood: 2.9641 Sigma2 Prior: -734.3968 Regularization: 0.0009
Iter: 18120 Training Loss: -719.0203
Negative Log Likelihood: 3.0579 Sigma2 Prior: -722.0790 Regularization: 0.0009
Iter: 18130 Training Loss: -735.2003
Negative Log Likelihood: 3.3053 Sigma2 Prior: -738.5064 Regularization: 0.0009
Iter: 18140 Training Loss: -723.2570
Negative Log Likelihood: 3.7483 Sigma2 Prior: -727.0062 Regularization: 0.0009
Iter: 18150 Training Loss: -715.6596
Negative Log Likelihood: 3.4673 Sigma2 Prior: -719.1277 Regularization: 0.0009
Iter: 18160 Training Loss: -729.8047
Negative Log Likelihood: 4.0329 Sigma2 Prior: -733.8384 Regularization: 0.0009
Iter: 18170 Training Loss: -733.0231
Negative Log Likelihood: 2.9725 Sigma2 Prior: -735.9964 Regularization: 0.0009
Iter: 18180 Training Loss: -730.6726
Negative Log Likelihood: 3.2830 Sigma2 Prior: -733.9564 Regularization: 0.0009
Iter: 18190 Training Loss: -727.7103
Negative Log Likelihood: 3.7366 Sigma2 Prior: -731.4478 Regularization: 0.0009
Iter: 18200 Training Loss: -723.6301
Negative Log Likelihood: 3.7258 Sigma2 Prior: -727.3568 Regularization: 0.0009
Iter: 18210 Training Loss: -720.7969
Negative Log Likelihood: 3.8962 Sigma2 Prior: -724.6940 Regularization: 0.0009
Iter: 18220 Training Loss: -727.0609
Negative Log Likelihood: 3.1796 Sigma2 Prior: -730.2414 Regularization: 0.0009
Iter: 18230 Training Loss: -730.5322
Negative Log Likelihood: 3.4885 Sigma2 Prior: -734.0216 Regularization: 0.0009
Iter: 18240 Training Loss: -723.6649
Negative Log Likelihood: 3.3813 Sigma2 Prior: -727.0471 Regularization: 0.0009
Iter: 18250 Training Loss: -732.3071
Negative Log Likelihood: 3.4511 Sigma2 Prior: -735.7590 Regularization: 0.0009
Iter: 18260 Training Loss: -728.3463
Negative Log Likelihood: 3.5084 Sigma2 Prior: -731.8555 Regularization: 0.0009
Iter: 18270 Training Loss: -722.5145
Negative Log Likelihood: 3.4872 Sigma2 Prior: -726.0027 Regularization: 0.0009
Iter: 18280 Training Loss: -722.6873
Negative Log Likelihood: 3.7080 Sigma2 Prior: -726.3962 Regularization: 0.0009
Iter: 18290 Training Loss: -731.3375
Negative Log Likelihood: 2.8188 Sigma2 Prior: -734.1573 Regularization: 0.0009
Iter: 18300 Training Loss: -716.3983
Negative Log Likelihood: 4.4600 Sigma2 Prior: -720.8593 Regularization: 0.0009
Iter: 18310 Training Loss: -721.8479
Negative Log Likelihood: 4.1418 Sigma2 Prior: -725.9906 Regularization: 0.0009
Iter: 18320 Training Loss: -726.2396
Negative Log Likelihood: 2.9520 Sigma2 Prior: -729.1926 Regularization: 0.0009
Iter: 18330 Training Loss: -729.7220
Negative Log Likelihood: 3.7198 Sigma2 Prior: -733.4427 Regularization: 0.0009
Iter: 18340 Training Loss: -717.2514
Negative Log Likelihood: 3.7263 Sigma2 Prior: -720.9786 Regularization: 0.0009
Iter: 18350 Training Loss: -721.9741
Negative Log Likelihood: 4.0273 Sigma2 Prior: -726.0024 Regularization: 0.0009
Iter: 18360 Training Loss: -724.9548
Negative Log Likelihood: 3.5613 Sigma2 Prior: -728.5171 Regularization: 0.0009
Iter: 18370 Training Loss: -730.3541
Negative Log Likelihood: 3.8970 Sigma2 Prior: -734.2520 Regularization: 0.0009
Iter: 18380 Training Loss: -724.5443
Negative Log Likelihood: 3.5227 Sigma2 Prior: -728.0679 Regularization: 0.0009
Iter: 18390 Training Loss: -727.5612
Negative Log Likelihood: 3.8676 Sigma2 Prior: -731.4297 Regularization: 0.0009
Iter: 18400 Training Loss: -720.9005
Negative Log Likelihood: 3.8609 Sigma2 Prior: -724.7623 Regularization: 0.0009
Iter: 18410 Training Loss: -733.0146
Negative Log Likelihood: 2.8709 Sigma2 Prior: -735.8865 Regularization: 0.0009
Iter: 18420 Training Loss: -725.4615
Negative Log Likelihood: 3.7473 Sigma2 Prior: -729.2098 Regularization: 0.0009
Iter: 18430 Training Loss: -739.8948
Negative Log Likelihood: 3.2017 Sigma2 Prior: -743.0974 Regularization: 0.0009
Iter: 18440 Training Loss: -725.7770
Negative Log Likelihood: 3.8088 Sigma2 Prior: -729.5867 Regularization: 0.0009
Iter: 18450 Training Loss: -726.0816
Negative Log Likelihood: 3.4727 Sigma2 Prior: -729.5552 Regularization: 0.0009
Iter: 18460 Training Loss: -721.1007
Negative Log Likelihood: 4.3283 Sigma2 Prior: -725.4299 Regularization: 0.0009
Iter: 18470 Training Loss: -716.9252
Negative Log Likelihood: 3.5513 Sigma2 Prior: -720.4774 Regularization: 0.0009
Iter: 18480 Training Loss: -723.2635
Negative Log Likelihood: 3.2795 Sigma2 Prior: -726.5439 Regularization: 0.0009
Iter: 18490 Training Loss: -736.3201
Negative Log Likelihood: 3.0107 Sigma2 Prior: -739.3317 Regularization: 0.0009
Iter: 18500 Training Loss: -722.7687
Negative Log Likelihood: 3.8806 Sigma2 Prior: -726.6503 Regularization: 0.0009
Iter: 18510 Training Loss: -720.6488
Negative Log Likelihood: 4.0466 Sigma2 Prior: -724.6963 Regularization: 0.0009
Iter: 18520 Training Loss: -715.7730
Negative Log Likelihood: 3.6419 Sigma2 Prior: -719.4158 Regularization: 0.0009
Iter: 18530 Training Loss: -720.5031
Negative Log Likelihood: 3.3576 Sigma2 Prior: -723.8616 Regularization: 0.0009
Iter: 18540 Training Loss: -718.6411
Negative Log Likelihood: 4.4016 Sigma2 Prior: -723.0436 Regularization: 0.0009
Iter: 18550 Training Loss: -730.5335
Negative Log Likelihood: 3.2203 Sigma2 Prior: -733.7548 Regularization: 0.0009
Iter: 18560 Training Loss: -738.2352
Negative Log Likelihood: 3.0599 Sigma2 Prior: -741.2960 Regularization: 0.0009
Iter: 18570 Training Loss: -717.2802
Negative Log Likelihood: 3.3132 Sigma2 Prior: -720.5943 Regularization: 0.0009
Iter: 18580 Training Loss: -730.0722
Negative Log Likelihood: 3.6486 Sigma2 Prior: -733.7217 Regularization: 0.0009
Iter: 18590 Training Loss: -727.7148
Negative Log Likelihood: 4.1452 Sigma2 Prior: -731.8609 Regularization: 0.0009
Iter: 18600 Training Loss: -723.9374
Negative Log Likelihood: 3.9961 Sigma2 Prior: -727.9344 Regularization: 0.0009
Iter: 18610 Training Loss: -728.9448
Negative Log Likelihood: 3.3668 Sigma2 Prior: -732.3124 Regularization: 0.0009
Iter: 18620 Training Loss: -730.1985
Negative Log Likelihood: 3.3684 Sigma2 Prior: -733.5678 Regularization: 0.0009
Iter: 18630 Training Loss: -734.1602
Negative Log Likelihood: 2.4652 Sigma2 Prior: -736.6263 Regularization: 0.0009
Iter: 18640 Training Loss: -724.9996
Negative Log Likelihood: 4.1637 Sigma2 Prior: -729.1643 Regularization: 0.0009
Iter: 18650 Training Loss: -717.0864
Negative Log Likelihood: 4.4523 Sigma2 Prior: -721.5396 Regularization: 0.0009
Iter: 18660 Training Loss: -726.5927
Negative Log Likelihood: 3.7421 Sigma2 Prior: -730.3358 Regularization: 0.0009
Iter: 18670 Training Loss: -730.1684
Negative Log Likelihood: 3.7006 Sigma2 Prior: -733.8699 Regularization: 0.0009
Iter: 18680 Training Loss: -723.7822
Negative Log Likelihood: 3.3940 Sigma2 Prior: -727.1771 Regularization: 0.0009
Iter: 18690 Training Loss: -727.3087
Negative Log Likelihood: 3.5405 Sigma2 Prior: -730.8500 Regularization: 0.0009
Iter: 18700 Training Loss: -712.7772
Negative Log Likelihood: 4.4598 Sigma2 Prior: -717.2379 Regularization: 0.0009
Iter: 18710 Training Loss: -729.6649
Negative Log Likelihood: 3.6878 Sigma2 Prior: -733.3535 Regularization: 0.0009
Iter: 18720 Training Loss: -730.0012
Negative Log Likelihood: 2.9319 Sigma2 Prior: -732.9341 Regularization: 0.0009
Iter: 18730 Training Loss: -723.5234
Negative Log Likelihood: 3.0886 Sigma2 Prior: -726.6129 Regularization: 0.0009
Iter: 18740 Training Loss: -725.0034
Negative Log Likelihood: 3.3629 Sigma2 Prior: -728.3672 Regularization: 0.0009
Iter: 18750 Training Loss: -729.1130
Negative Log Likelihood: 3.6020 Sigma2 Prior: -732.7159 Regularization: 0.0009
Iter: 18760 Training Loss: -721.8073
Negative Log Likelihood: 3.1322 Sigma2 Prior: -724.9404 Regularization: 0.0009
Iter: 18770 Training Loss: -729.0444
Negative Log Likelihood: 2.9291 Sigma2 Prior: -731.9744 Regularization: 0.0009
Iter: 18780 Training Loss: -720.4155
Negative Log Likelihood: 3.4577 Sigma2 Prior: -723.8741 Regularization: 0.0009
Iter: 18790 Training Loss: -724.4778
Negative Log Likelihood: 4.1772 Sigma2 Prior: -728.6559 Regularization: 0.0009
Iter: 18800 Training Loss: -723.5386
Negative Log Likelihood: 3.4259 Sigma2 Prior: -726.9654 Regularization: 0.0009
Iter: 18810 Training Loss: -725.4504
Negative Log Likelihood: 3.3177 Sigma2 Prior: -728.7690 Regularization: 0.0009
Iter: 18820 Training Loss: -726.7651
Negative Log Likelihood: 2.4854 Sigma2 Prior: -729.2515 Regularization: 0.0009
Iter: 18830 Training Loss: -737.9675
Negative Log Likelihood: 2.9460 Sigma2 Prior: -740.9144 Regularization: 0.0009
Iter: 18840 Training Loss: -716.5260
Negative Log Likelihood: 3.6460 Sigma2 Prior: -720.1729 Regularization: 0.0009
Iter: 18850 Training Loss: -721.5054
Negative Log Likelihood: 3.9160 Sigma2 Prior: -725.4223 Regularization: 0.0009
Iter: 18860 Training Loss: -731.4354
Negative Log Likelihood: 3.6249 Sigma2 Prior: -735.0612 Regularization: 0.0009
Iter: 18870 Training Loss: -720.6439
Negative Log Likelihood: 3.3821 Sigma2 Prior: -724.0269 Regularization: 0.0009
Iter: 18880 Training Loss: -728.7123
Negative Log Likelihood: 3.4139 Sigma2 Prior: -732.1272 Regularization: 0.0009
Iter: 18890 Training Loss: -718.4945
Negative Log Likelihood: 3.5948 Sigma2 Prior: -722.0902 Regularization: 0.0009
Iter: 18900 Training Loss: -738.0698
Negative Log Likelihood: 3.2242 Sigma2 Prior: -741.2949 Regularization: 0.0009
Iter: 18910 Training Loss: -715.5879
Negative Log Likelihood: 4.3727 Sigma2 Prior: -719.9615 Regularization: 0.0009
Iter: 18920 Training Loss: -724.5812
Negative Log Likelihood: 3.7560 Sigma2 Prior: -728.3382 Regularization: 0.0009
Iter: 18930 Training Loss: -736.6590
Negative Log Likelihood: 3.0381 Sigma2 Prior: -739.6981 Regularization: 0.0009
Iter: 18940 Training Loss: -724.1682
Negative Log Likelihood: 3.8337 Sigma2 Prior: -728.0027 Regularization: 0.0009
Iter: 18950 Training Loss: -716.3989
Negative Log Likelihood: 3.7087 Sigma2 Prior: -720.1085 Regularization: 0.0009
Iter: 18960 Training Loss: -735.7188
Negative Log Likelihood: 3.2965 Sigma2 Prior: -739.0162 Regularization: 0.0009
Iter: 18970 Training Loss: -724.4465
Negative Log Likelihood: 3.9145 Sigma2 Prior: -728.3619 Regularization: 0.0009
Iter: 18980 Training Loss: -732.5795
Negative Log Likelihood: 3.6391 Sigma2 Prior: -736.2195 Regularization: 0.0009
Iter: 18990 Training Loss: -723.2828
Negative Log Likelihood: 2.9152 Sigma2 Prior: -726.1989 Regularization: 0.0009
Iter: 19000 Training Loss: -718.0479
Negative Log Likelihood: 3.6505 Sigma2 Prior: -721.6993 Regularization: 0.0009
Iter: 19010 Training Loss: -718.0210
Negative Log Likelihood: 3.5526 Sigma2 Prior: -721.5745 Regularization: 0.0009
Iter: 19020 Training Loss: -732.0553
Negative Log Likelihood: 3.5711 Sigma2 Prior: -735.6273 Regularization: 0.0009
Iter: 19030 Training Loss: -725.2109
Negative Log Likelihood: 3.2720 Sigma2 Prior: -728.4838 Regularization: 0.0009
Iter: 19040 Training Loss: -711.1615
Negative Log Likelihood: 4.2128 Sigma2 Prior: -715.3752 Regularization: 0.0009
Iter: 19050 Training Loss: -723.4204
Negative Log Likelihood: 3.5522 Sigma2 Prior: -726.9736 Regularization: 0.0009
Iter: 19060 Training Loss: -716.4441
Negative Log Likelihood: 4.0505 Sigma2 Prior: -720.4955 Regularization: 0.0009
Iter: 19070 Training Loss: -720.5498
Negative Log Likelihood: 4.1448 Sigma2 Prior: -724.6956 Regularization: 0.0009
Iter: 19080 Training Loss: -729.1807
Negative Log Likelihood: 3.2209 Sigma2 Prior: -732.4025 Regularization: 0.0009
Iter: 19090 Training Loss: -733.9035
Negative Log Likelihood: 2.8876 Sigma2 Prior: -736.7920 Regularization: 0.0009
Iter: 19100 Training Loss: -724.0207
Negative Log Likelihood: 3.5670 Sigma2 Prior: -727.5886 Regularization: 0.0009
Iter: 19110 Training Loss: -731.9824
Negative Log Likelihood: 3.3941 Sigma2 Prior: -735.3774 Regularization: 0.0009
Iter: 19120 Training Loss: -722.0813
Negative Log Likelihood: 3.5082 Sigma2 Prior: -725.5905 Regularization: 0.0009
Iter: 19130 Training Loss: -721.3516
Negative Log Likelihood: 3.7437 Sigma2 Prior: -725.0962 Regularization: 0.0009
Iter: 19140 Training Loss: -727.6929
Negative Log Likelihood: 3.1762 Sigma2 Prior: -730.8700 Regularization: 0.0009
Iter: 19150 Training Loss: -724.1962
Negative Log Likelihood: 3.8586 Sigma2 Prior: -728.0557 Regularization: 0.0009
Iter: 19160 Training Loss: -726.2159
Negative Log Likelihood: 4.1562 Sigma2 Prior: -730.3731 Regularization: 0.0009
Iter: 19170 Training Loss: -724.8551
Negative Log Likelihood: 3.2995 Sigma2 Prior: -728.1555 Regularization: 0.0009
Iter: 19180 Training Loss: -726.8615
Negative Log Likelihood: 3.0402 Sigma2 Prior: -729.9026 Regularization: 0.0009
Iter: 19190 Training Loss: -721.2267
Negative Log Likelihood: 3.8729 Sigma2 Prior: -725.1005 Regularization: 0.0009
Iter: 19200 Training Loss: -728.1832
Negative Log Likelihood: 3.0765 Sigma2 Prior: -731.2606 Regularization: 0.0009
Iter: 19210 Training Loss: -722.7694
Negative Log Likelihood: 3.5572 Sigma2 Prior: -726.3275 Regularization: 0.0009
Iter: 19220 Training Loss: -738.0564
Negative Log Likelihood: 2.7489 Sigma2 Prior: -740.8062 Regularization: 0.0009
Iter: 19230 Training Loss: -727.4341
Negative Log Likelihood: 3.3829 Sigma2 Prior: -730.8179 Regularization: 0.0009
Iter: 19240 Training Loss: -731.0580
Negative Log Likelihood: 3.8103 Sigma2 Prior: -734.8693 Regularization: 0.0009
Iter: 19250 Training Loss: -723.8751
Negative Log Likelihood: 3.5716 Sigma2 Prior: -727.4476 Regularization: 0.0009
Iter: 19260 Training Loss: -720.5511
Negative Log Likelihood: 4.0518 Sigma2 Prior: -724.6039 Regularization: 0.0009
Iter: 19270 Training Loss: -726.2332
Negative Log Likelihood: 3.6454 Sigma2 Prior: -729.8795 Regularization: 0.0009
Iter: 19280 Training Loss: -728.4997
Negative Log Likelihood: 3.1381 Sigma2 Prior: -731.6387 Regularization: 0.0009
Iter: 19290 Training Loss: -739.8879
Negative Log Likelihood: 2.6957 Sigma2 Prior: -742.5845 Regularization: 0.0009
Iter: 19300 Training Loss: -736.6789
Negative Log Likelihood: 3.7705 Sigma2 Prior: -740.4503 Regularization: 0.0009
Iter: 19310 Training Loss: -727.7253
Negative Log Likelihood: 3.6100 Sigma2 Prior: -731.3362 Regularization: 0.0009
Iter: 19320 Training Loss: -715.2612
Negative Log Likelihood: 3.9619 Sigma2 Prior: -719.2240 Regularization: 0.0009
Iter: 19330 Training Loss: -731.0646
Negative Log Likelihood: 3.5442 Sigma2 Prior: -734.6097 Regularization: 0.0009
Iter: 19340 Training Loss: -736.9789
Negative Log Likelihood: 3.2188 Sigma2 Prior: -740.1987 Regularization: 0.0009
Iter: 19350 Training Loss: -723.4965
Negative Log Likelihood: 4.1973 Sigma2 Prior: -727.6947 Regularization: 0.0009
Iter: 19360 Training Loss: -706.8997
Negative Log Likelihood: 4.6910 Sigma2 Prior: -711.5916 Regularization: 0.0009
Iter: 19370 Training Loss: -721.6760
Negative Log Likelihood: 3.8923 Sigma2 Prior: -725.5693 Regularization: 0.0009
Iter: 19380 Training Loss: -725.4872
Negative Log Likelihood: 3.3343 Sigma2 Prior: -728.8224 Regularization: 0.0009
Iter: 19390 Training Loss: -732.0259
Negative Log Likelihood: 3.3257 Sigma2 Prior: -735.3526 Regularization: 0.0009
Iter: 19400 Training Loss: -724.0025
Negative Log Likelihood: 3.3673 Sigma2 Prior: -727.3707 Regularization: 0.0009
Iter: 19410 Training Loss: -730.3519
Negative Log Likelihood: 3.4988 Sigma2 Prior: -733.8516 Regularization: 0.0009
Iter: 19420 Training Loss: -727.2773
Negative Log Likelihood: 3.3995 Sigma2 Prior: -730.6777 Regularization: 0.0009
Iter: 19430 Training Loss: -722.2666
Negative Log Likelihood: 4.1232 Sigma2 Prior: -726.3907 Regularization: 0.0009
Iter: 19440 Training Loss: -722.8105
Negative Log Likelihood: 3.7380 Sigma2 Prior: -726.5494 Regularization: 0.0009
Iter: 19450 Training Loss: -725.7615
Negative Log Likelihood: 3.5917 Sigma2 Prior: -729.3541 Regularization: 0.0009
Iter: 19460 Training Loss: -722.9622
Negative Log Likelihood: 3.4262 Sigma2 Prior: -726.3893 Regularization: 0.0009
Iter: 19470 Training Loss: -725.1686
Negative Log Likelihood: 3.2511 Sigma2 Prior: -728.4207 Regularization: 0.0009
Iter: 19480 Training Loss: -718.3544
Negative Log Likelihood: 4.3529 Sigma2 Prior: -722.7083 Regularization: 0.0009
Iter: 19490 Training Loss: -720.4564
Negative Log Likelihood: 3.4745 Sigma2 Prior: -723.9318 Regularization: 0.0009
Iter: 19500 Training Loss: -724.8195
Negative Log Likelihood: 3.5798 Sigma2 Prior: -728.4002 Regularization: 0.0009
Iter: 19510 Training Loss: -723.1507
Negative Log Likelihood: 3.3282 Sigma2 Prior: -726.4799 Regularization: 0.0009
Iter: 19520 Training Loss: -710.6179
Negative Log Likelihood: 3.4111 Sigma2 Prior: -714.0298 Regularization: 0.0009
Iter: 19530 Training Loss: -732.1906
Negative Log Likelihood: 2.6843 Sigma2 Prior: -734.8757 Regularization: 0.0009
Iter: 19540 Training Loss: -725.1456
Negative Log Likelihood: 2.8385 Sigma2 Prior: -727.9849 Regularization: 0.0009
Iter: 19550 Training Loss: -724.5634
Negative Log Likelihood: 3.8259 Sigma2 Prior: -728.3902 Regularization: 0.0009
Iter: 19560 Training Loss: -725.5397
Negative Log Likelihood: 4.0222 Sigma2 Prior: -729.5629 Regularization: 0.0009
Iter: 19570 Training Loss: -727.6682
Negative Log Likelihood: 3.1170 Sigma2 Prior: -730.7860 Regularization: 0.0009
Iter: 19580 Training Loss: -720.1841
Negative Log Likelihood: 3.1913 Sigma2 Prior: -723.3763 Regularization: 0.0009
Iter: 19590 Training Loss: -725.9839
Negative Log Likelihood: 3.2720 Sigma2 Prior: -729.2568 Regularization: 0.0009
Iter: 19600 Training Loss: -719.4259
Negative Log Likelihood: 3.6081 Sigma2 Prior: -723.0349 Regularization: 0.0009
Iter: 19610 Training Loss: -718.7526
Negative Log Likelihood: 3.5830 Sigma2 Prior: -722.3365 Regularization: 0.0009
Iter: 19620 Training Loss: -725.8557
Negative Log Likelihood: 3.0600 Sigma2 Prior: -728.9166 Regularization: 0.0009
Iter: 19630 Training Loss: -723.4412
Negative Log Likelihood: 3.5827 Sigma2 Prior: -727.0248 Regularization: 0.0009
Iter: 19640 Training Loss: -732.2924
Negative Log Likelihood: 3.3664 Sigma2 Prior: -735.6597 Regularization: 0.0009
Iter: 19650 Training Loss: -722.6198
Negative Log Likelihood: 3.3781 Sigma2 Prior: -725.9988 Regularization: 0.0009
Iter: 19660 Training Loss: -721.0581
Negative Log Likelihood: 3.4956 Sigma2 Prior: -724.5547 Regularization: 0.0009
Iter: 19670 Training Loss: -727.7924
Negative Log Likelihood: 3.3003 Sigma2 Prior: -731.0936 Regularization: 0.0009
Iter: 19680 Training Loss: -727.6862
Negative Log Likelihood: 3.6171 Sigma2 Prior: -731.3043 Regularization: 0.0009
Iter: 19690 Training Loss: -724.1562
Negative Log Likelihood: 3.5555 Sigma2 Prior: -727.7126 Regularization: 0.0009
Iter: 19700 Training Loss: -722.2432
Negative Log Likelihood: 4.3937 Sigma2 Prior: -726.6378 Regularization: 0.0009
Iter: 19710 Training Loss: -720.6392
Negative Log Likelihood: 3.3146 Sigma2 Prior: -723.9548 Regularization: 0.0009
Iter: 19720 Training Loss: -728.5101
Negative Log Likelihood: 3.4281 Sigma2 Prior: -731.9392 Regularization: 0.0009
Iter: 19730 Training Loss: -726.2839
Negative Log Likelihood: 3.1236 Sigma2 Prior: -729.4084 Regularization: 0.0009
Iter: 19740 Training Loss: -728.2184
Negative Log Likelihood: 3.9936 Sigma2 Prior: -732.2130 Regularization: 0.0009
Iter: 19750 Training Loss: -727.6571
Negative Log Likelihood: 3.4502 Sigma2 Prior: -731.1083 Regularization: 0.0009
Iter: 19760 Training Loss: -729.0967
Negative Log Likelihood: 3.5021 Sigma2 Prior: -732.5997 Regularization: 0.0009
Iter: 19770 Training Loss: -719.5427
Negative Log Likelihood: 3.5715 Sigma2 Prior: -723.1152 Regularization: 0.0009
Iter: 19780 Training Loss: -719.2875
Negative Log Likelihood: 4.0046 Sigma2 Prior: -723.2931 Regularization: 0.0009
Iter: 19790 Training Loss: -723.3829
Negative Log Likelihood: 3.2781 Sigma2 Prior: -726.6619 Regularization: 0.0009
Iter: 19800 Training Loss: -725.5870
Negative Log Likelihood: 4.1253 Sigma2 Prior: -729.7132 Regularization: 0.0009
Iter: 19810 Training Loss: -719.1433
Negative Log Likelihood: 4.1278 Sigma2 Prior: -723.2720 Regularization: 0.0009
Iter: 19820 Training Loss: -729.5969
Negative Log Likelihood: 3.5349 Sigma2 Prior: -733.1328 Regularization: 0.0009
Iter: 19830 Training Loss: -728.7231
Negative Log Likelihood: 3.0107 Sigma2 Prior: -731.7347 Regularization: 0.0009
Iter: 19840 Training Loss: -725.7592
Negative Log Likelihood: 3.1520 Sigma2 Prior: -728.9122 Regularization: 0.0009
Iter: 19850 Training Loss: -730.2125
Negative Log Likelihood: 3.5164 Sigma2 Prior: -733.7298 Regularization: 0.0009
Iter: 19860 Training Loss: -721.7269
Negative Log Likelihood: 3.6223 Sigma2 Prior: -725.3501 Regularization: 0.0009
Iter: 19870 Training Loss: -728.2869
Negative Log Likelihood: 3.5931 Sigma2 Prior: -731.8809 Regularization: 0.0009
Iter: 19880 Training Loss: -712.1382
Negative Log Likelihood: 3.7918 Sigma2 Prior: -715.9309 Regularization: 0.0009
Iter: 19890 Training Loss: -727.8840
Negative Log Likelihood: 3.5483 Sigma2 Prior: -731.4332 Regularization: 0.0009
Iter: 19900 Training Loss: -727.7476
Negative Log Likelihood: 3.2423 Sigma2 Prior: -730.9908 Regularization: 0.0009
Iter: 19910 Training Loss: -730.3909
Negative Log Likelihood: 3.6628 Sigma2 Prior: -734.0547 Regularization: 0.0009
Iter: 19920 Training Loss: -721.8326
Negative Log Likelihood: 3.4347 Sigma2 Prior: -725.2682 Regularization: 0.0009
Iter: 19930 Training Loss: -732.1597
Negative Log Likelihood: 3.3994 Sigma2 Prior: -735.5600 Regularization: 0.0009
Iter: 19940 Training Loss: -726.5756
Negative Log Likelihood: 3.9425 Sigma2 Prior: -730.5190 Regularization: 0.0009
Iter: 19950 Training Loss: -724.1501
Negative Log Likelihood: 3.9731 Sigma2 Prior: -728.1241 Regularization: 0.0009
Iter: 19960 Training Loss: -719.7261
Negative Log Likelihood: 3.7234 Sigma2 Prior: -723.4504 Regularization: 0.0009
Iter: 19970 Training Loss: -720.8751
Negative Log Likelihood: 4.1461 Sigma2 Prior: -725.0220 Regularization: 0.0009
Iter: 19980 Training Loss: -722.9457
Negative Log Likelihood: 3.7966 Sigma2 Prior: -726.7432 Regularization: 0.0009
Iter: 19990 Training Loss: -728.5401
Negative Log Likelihood: 3.0123 Sigma2 Prior: -731.5533 Regularization: 0.0009
Iter: 19999 Training Loss: -731.6766
Negative Log Likelihood: 2.5480 Sigma2 Prior: -734.2255 Regularization: 0.0009
Done training with 20000 iterations
Traceback (most recent call last):
File "/home/rice/PycharmProjects/uis-rnn-master/demo.py", line 85, in
main()
File "/home/rice/PycharmProjects/uis-rnn-master/demo.py", line 81, in main
diarization_experiment(model_args, training_args, inference_args)
File "/home/rice/PycharmProjects/uis-rnn-master/demo.py", line 61, in diarization_experiment
predicted_label = model.predict(test_sequence, inference_args)
File "/home/rice/PycharmProjects/uis-rnn-master/uisrnn/uisrnn.py", line 573, in predict
return self.predict_single(test_sequences, args)
File "/home/rice/PycharmProjects/uis-rnn-master/uisrnn/uisrnn.py", line 503, in predict_single
raise ValueError('test_sequence must be 2-dim array.')
ValueError: test_sequence must be 2-dim array.

Process finished with exit code 1

Can you help me solve this problem?
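For reference, the check that fails is in `predict_single`, which apparently expects one 2-dim float array of shape (sequence_length, embedding_dim) rather than a list or a 3-dim array, so I have been experimenting with flattening the saved sequence before calling `predict`. This is only my own workaround attempt, and the file names below are just my setup (written by dvector_create.py):

```python
import numpy as np

# Load the d-vector sequence saved earlier (path/name are my own setup).
test_sequence = np.load('test_sequence.npy', allow_pickle=True)
print(test_sequence.shape, test_sequence.dtype)

# model.predict() wants a single 2-dim float array. If the saved array has an
# extra leading dimension, or is an object array wrapping several utterances,
# flatten it back down to 2-dim first.
if test_sequence.dtype == object:
    # object array of per-utterance (T_i, 256) arrays -> stack them
    test_sequence = np.concatenate([np.asarray(s) for s in test_sequence], axis=0)
elif test_sequence.ndim == 3:
    # e.g. (1, T, 256) or (N, T, 256) -> (N*T, 256)
    test_sequence = test_sequence.reshape(-1, test_sequence.shape[-1])

test_sequence = np.asarray(test_sequence, dtype=float)
assert test_sequence.ndim == 2   # now safe for model.predict(test_sequence, inference_args)
```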

Why do I get EER : 0.00 (thres:0.00, FAR:0.00, FRR:0.00) for enrollment and verification when N = 1?

With the number of speakers in the batch set to N = 1, the test output is:
EER : 0.00 (thres:0.00, FAR:0.00, FRR:0.00)

torch.Size([1, 4, 160, 40]) mel_db_batch.size()
4 mel_db_batch.size(1)
0 batch_id
torch.Size([1, 2, 160, 40]) enrollment_batch size
torch.Size([1, 2, 160, 40]) verification_batch size
torch.Size([2, 160, 40]) reshape tensor enrollment_batch size
torch.Size([2, 160, 40]) reshape tensor verification_batch size
2 reshape tensor verification_batch size(0)
[0, 1] perm
[0, 1] unperm
torch.Size([2, 160, 40]) verification_batch
torch.Size([2, 256]) enrollment_embeddings
torch.Size([2, 256]) verification_embeddings
torch.Size([2, 256]) verification_embeddings
torch.Size([1, 2, 256]) enrollment_embeddings.size()
torch.Size([1, 2, 256]) verification_embeddings.size()
torch.Size([1, 256]) enrollment_centroids.size()
tensor([[[0.7973],
         [0.7973]]], grad_fn=<CopySlices>) sim_matrix
torch.Size([1, 2, 1]) sim_matrix.size()

EER : 0.00 (thres:0.00, FAR:0.00, FRR:0.00)

EER across 10 epochs: 0.0000
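My guess is that this is expected rather than a bug: with N = 1 the similarity matrix printed above has shape [1, 2, 1], so every score is a genuine (same-speaker) comparison and there are no impostor scores at all. FAR is then 0 at every threshold and the EER sweep ends up at 0.00. A rough sketch of what I mean with stand-in numbers (this is not the repo's actual test code):

```python
import torch

torch.manual_seed(0)
N, M = 1, 2                       # speakers per batch, verification utterances per speaker
sim_matrix = torch.rand(N, M, N)  # stand-in for the real [N, M, N] cosine similarity matrix

for i in range(0, 101, 20):
    thres = i / 100
    accepted = sim_matrix > thres
    # genuine trials: utterance of speaker s scored against speaker s's own centroid
    true_accept = sum(accepted[s, :, s].sum().item() for s in range(N))
    false_accept = accepted.sum().item() - true_accept
    impostor_trials = N * M * (N - 1)      # == 0 when N == 1: no impostor scores exist
    far = false_accept / impostor_trials if impostor_trials else 0.0
    frr = 1.0 - true_accept / (N * M)
    print(f'thres={thres:.2f} FAR={far:.2f} FRR={frr:.2f}')
# With N == 1, FAR is 0 at every threshold, so the reported EER collapses to 0.00.
```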

If N > 1, on the other hand, I get:
EER : 0.01 (thres:0.60, FAR:0.03, FRR:0.00)

EER : 0.00 (thres:0.71, FAR:0.00, FRR:0.00)
torch.Size([4, 6, 160, 40]) mel_db_batch.size()
6 mel_db_batch.size(1)
11 batch_id
torch.Size([4, 3, 160, 40]) enrollment_batch size
torch.Size([4, 3, 160, 40]) verification_batch size
torch.Size([12, 160, 40]) reshape tensor enrollment_batch size
torch.Size([12, 160, 40]) reshape tensor verification_batch size
12 reshape tensor verification_batch size(0)
[2, 6, 5, 4, 7, 8, 9, 1, 0, 10, 3, 11] perm
[8, 7, 0, 10, 3, 2, 1, 4, 5, 6, 9, 11] unperm
torch.Size([12, 160, 40]) verification_batch
torch.Size([12, 256]) enrollment_embeddings
torch.Size([12, 256]) verification_embeddings
torch.Size([12, 256]) verification_embeddings
torch.Size([4, 3, 256]) enrollment_embeddings.size()
torch.Size([4, 3, 256]) verification_embeddings.size()
torch.Size([4, 256]) enrollment_centroids.size()

torch.Size([4, 3, 4]) sim_matrix.size()

EER : 0.01 (thres:0.60, FAR:0.03, FRR:0.00)

How can I pick the best threshold for single-speaker enrollment vs. verification at test time?
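For context, the only scheme I can think of is to calibrate a fixed threshold offline: score the enrolled speaker's centroid against held-out utterances of the same speaker (genuine trials) and against utterances of other speakers (impostor trials), then pick the threshold where FAR and FRR cross. A rough sketch with random stand-in embeddings (the shapes are just what I assume the embedder produces; this is not the repo's code):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-ins for 256-dim d-vectors; in practice these would come from
# embedder_net(mel_db) on real utterances.
enroll_emb   = torch.randn(3, 256)    # enrollment utterances of the target speaker
genuine_emb  = torch.randn(5, 256)    # held-out utterances of the same speaker
impostor_emb = torch.randn(50, 256)   # utterances of other speakers

centroid = F.normalize(enroll_emb.mean(dim=0), dim=0)

def scores(emb):
    # cosine similarity of each (normalized) embedding against the centroid
    return torch.mv(F.normalize(emb, dim=1), centroid).tolist()

genuine_scores, impostor_scores = scores(genuine_emb), scores(impostor_emb)

# sweep thresholds and keep the one where FAR and FRR are closest (EER point)
best_thres, best_gap = 0.0, 1.0
for i in range(101):
    thres = i / 100
    far = sum(s > thres for s in impostor_scores) / len(impostor_scores)
    frr = sum(s <= thres for s in genuine_scores) / len(genuine_scores)
    if abs(far - frr) < best_gap:
        best_thres, best_gap = thres, abs(far - frr)
print('calibrated threshold:', best_thres)
```

Is something like this the intended way to set the threshold for a single enrolled speaker, or is there a better approach?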

@HarryVolek Thanks sir.
