petteriteikari / r-plr

Separate repos were combined into one, making things easier in the end.
Error in pathology.lookup.table(group_name_in) :
object 'group_name_out' not found
Not all possible Diagnosis codes are currently defined in the pathology.lookup.table() function:
if (identical(group_name_in, 'POAG') |
    identical(group_name_in, 'NTG') |
    identical(group_name_in, 'DISC SUSPECT')) {
  group_name_out = 'Glaucoma'
}
In other words, if you had PACG in your Excel data sheet, the script in the example above would not know which "Master label" it should have.
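One way to make the lookup fail loudly on unknown codes (a hypothetical sketch, not the repo's actual implementation; the named-vector contents are assumptions):

```r
# Hypothetical sketch of pathology.lookup.table(): map each diagnosis code
# to its master label via a named vector, and stop() on unknown codes so a
# new code like 'PACG' cannot slip through as an undefined group_name_out.
pathology.lookup.table <- function(group_name_in) {
  lookup <- c('POAG'         = 'Glaucoma',
              'NTG'          = 'Glaucoma',
              'DISC SUSPECT' = 'Glaucoma',
              'PACG'         = 'Glaucoma')  # add every code used in the Excel sheet
  if (!group_name_in %in% names(lookup)) {
    stop('Unknown diagnosis code: "', group_name_in,
         '" -- please add it to pathology.lookup.table()')
  }
  unname(lookup[group_name_in])
}

pathology.lookup.table('PACG')  # 'Glaucoma'
```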
.. Removed total of " 92 " rows from Control Excel sheet
.. Removed total of " 58 " rows from Glaucoma Excel sheet
.. Removed total of " 78 " rows from Diabetes Excel sheet
.. Removed total of " 96 " rows from Exceptional Cases (Neuro) Excel sheet
Error in rbind(deparse.level, ...) :
numbers of columns of arguments do not match
With "Ctrl+Alt+Enter" you do not even get an exact line reference for the problem, but this error means that your Master Data sheets do not all have the same columns.
The traceback goes here:
6. stop("numbers of columns of arguments do not match")
5. rbind(deparse.level, ...)
4. rbind(comb1, df_diabetes2) at combine_excelDataFramesToOne.R#28
3. combine.excelDataFramesToOne(df_control, df_glaucoma, df_diabetes,
df_neuro, vars_to_keep) at read_theMasterExcel.R#40
2. read.theMasterExcel(masterXLS_data_path, XLS_filename) at import_computedFeats.R#9
1. import.computedFeats(data_path_feats, pattern_to_find, dataset_type,
masterXLS_data_path, XLS_filename)
rbind() joins the rows together and expects each argument to have the same number of columns.
A warning was added to highlight the problematic sheets:
7485ecc
Your sheets are not the same length! Fix your Master Data Sheet!
no of columns in Control sheet = 63
no of columns in Glaucoma sheet = 63
no of columns in Diabetes sheet = 62
no of columns in Neuro sheet = 62
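Beyond counting columns, a small helper can name the missing column(s) per sheet (a sketch; `check.sheet.columns` and the toy data frames are hypothetical, not the repo's code):

```r
# Sketch: report which columns each sheet is missing, instead of only
# "numbers of columns of arguments do not match" from rbind().
check.sheet.columns <- function(sheets) {
  all_cols <- unique(unlist(lapply(sheets, names)))
  missing  <- lapply(sheets, function(df) setdiff(all_cols, names(df)))
  missing[vapply(missing, length, integer(1)) > 0]
}

# Toy example: the Diabetes sheet lacks the 'Age' column
sheets <- list(Control  = data.frame(Code = 'C01', Age = 60),
               Diabetes = data.frame(Code = 'D01'))
check.sheet.columns(sheets)  # names the Diabetes sheet and its missing 'Age'
```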
library(rstudioapi)
Error in library(rstudioapi) : there is no package called ‘rstudioapi’
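A minimal guard (a sketch) that installs the package only when it is absent before loading it; the same pattern applies to any other missing dependency reported this way:

```r
# Install rstudioapi on first use if it is missing, then load it.
if (!requireNamespace('rstudioapi', quietly = TRUE)) {
  install.packages('rstudioapi')
}
library(rstudioapi)
```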
"Recon" folders are being created outside the TEST_OUT folder.
When running the Shiny server to inspect manually:
warning('Petteri: Inspect the imputation now manually, \n
or change the path from next block if you do not want to do it \n
Eyeball the details from the GITHUB WIKI if this does not make sense to you')
Warning in file(file, "rt") :
cannot open file 'C:/Users/User/Desktop/RPLR/TEST_OUT/imputation_final/NA': No such file or directory
Warning: Error in file: cannot open the connection
50: file
49: read.table
48: read.csv
47: server [C:\Users\User\Desktop\RPLR\R-PLR\Apps_Shiny\inspect_outliers/server.R#105]
Error in file(file, "rt") : cannot open the connection
DATA IN: C:/Users/User/Desktop/RPLR/TEST_IN/recon_EMD
Ray: should this be TEST_OUT?
Error given when trying to manually inspect the EMD
> runApp('Apps_Shiny/inspect_EMD')
Listening on http://127.0.0.1:7682
--- just_the_file = server.R
--- --- full_path_script = C:/Users/User/Desktop/RPLR/R-PLR/Apps_Shiny/inspect_EMD/server.R
DATA IN: C:/Users/User/Desktop/RPLR/TEST_IN/recon_EMD
DATA OUT: C:/Users/User/Desktop/RPLR/TEST_IN/recon_EMD/IMF_fusion
... moving done files to: C:/Users/User/Desktop/RPLR/TEST_IN/recon_EMD/DONE
Warning in server(...) :
No input files were found from DATA IN = "C:/Users/User/Desktop/RPLR/TEST_IN/recon_EMD"
.... There are no done files from your "check path" = C:/Users/User/Desktop/RPLR/TEST_IN/recon_EMD/IMF_fusion
-> in other words, we now assume that you have not yet processed any of the input files
EXPLANATION #2: Why do we check the "Reconstructed path"? Because that is the end point of this script
Warning in check.for.done.filecodes(files_fullpath, path_out) :
There are no done files from your "input path"
-> we cannot process anything now!
.. found 0 unprocessed input files
Input file: NA
Warning in server(...) :
Well we have no input filename to open as no files were found from input path
Warning in file(file, "rt") :
cannot open file 'NA': No such file or directory
Warning: Error in file: cannot open the connection
50: file
49: read.table
48: read.csv
47: server [C:\Users\User\Desktop\RPLR\R-PLR\Apps_Shiny\inspect_EMD/server.R#95]
Error in file(file, "rt") : cannot open the connection
Error in library(pracma) : there is no package called ‘pracma’
when calling
# Finally compute the hand-crafted features here
batch.PLR.analyze.reconstructions(data_path = paths[['data_in']][['features']],
data_path_out = paths[['data_out']][['features']],
RPLR_analysis_path = paths[['analysis']],
parameters, RPLR_paths, masterExcel,
process_only_unprocessed = TRUE,
path_check_for_done = paths[['data_out']][['features']],
no_of_cores_to_use = detectCores(),
pupil_col = 'pupil')
This comes out after calling import.and.install.libraries(paths):
Checking the LIBRARIES
Creating the directory for DATA Recon output
Creating the directory for DATA Imputed output
Creating the directory for DATA Trimmed output
Creating the directory for DATA Recon EMD output
Warning messages:
1: In dir.create(data_path_out, showWarnings = TRUE, recursive = FALSE, :
cannot create dir 'C:\Users\Ray-Najjar\Desktop\GitPLR\R-PLR\..\TEST_OUT\outlier_free\outlier_free_corrected\..\recon', reason 'No such file or directory'
2: In dir.create(data_resampled_path_out, showWarnings = TRUE, recursive = FALSE, :
cannot create dir 'C:\Users\Ray-Najjar\Desktop\GitPLR\R-PLR\..\TEST_OUT\outlier_free\outlier_free_corrected\..\recon_resampled', reason 'No such file or directory'
3: In dir.create(data_trimmed_path_out, showWarnings = TRUE, recursive = FALSE, :
cannot create dir 'C:\Users\Ray-Najjar\Desktop\GitPLR\R-PLR\..\TEST_OUT\outlier_free\outlier_free_corrected\..\recon_trimmed', reason 'No such file or directory'
4: In dir.create(data_temp_path_out, showWarnings = TRUE, recursive = FALSE, :
cannot create dir 'C:\Users\Ray-Najjar\Desktop\GitPLR\R-PLR\..\TEST_OUT\outlier_free\outlier_free_corrected\..\recon_EMD', reason 'No such file or directory'
[Petteri edit] @ray-najjar: see the Markdown Cheatsheet for wrapping pasted code in triple backticks with the language name.
The forecast package was installed with CUDA 8.xx, and for some reason does not get properly updated to 9.xx even with install.packages("forecast"):
Error: package or namespace load failed for ‘forecast’ in dyn.load(file, DLLpath = DLLpath, ...):
unable to load shared object '/home/petteri/R/x86_64-pc-linux-gnu-library/3.4/uroot/libs/uroot.so':
libcudart.so.8.0: cannot open shared object file: No such file or directory
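A possible workaround (an assumption, not verified on this setup): the failing shared object belongs to uroot, a dependency of forecast, so updating forecast alone never rebuilds it; reinstalling uroot from source should relink it against the CUDA runtime that is actually installed:

```r
# Rebuild 'uroot' from source so its uroot.so links against the current
# CUDA libraries, then reinstall 'forecast' on top of it.
remove.packages('uroot')
install.packages('uroot', type = 'source')
install.packages('forecast')
```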
= C:/Users/User/Desktop/RPLR/TEST_OUT/PLR_feat
-> in other words, we now assume that you have not yet processed any of the input files
EXPLANATION #2: Why do we check the "Reconstructed path"? Because that is the end point of this script
.. found 3 unprocessed input files
Analyzing file = PLR2076_reconstruction.csv
Traditional time domain features: BLUE RED
Fractal features
Time-Frequency features
[1] "IMF 1 COMPLETE!"
[1] "IMF 2 COMPLETE!"
[1] "IMF 3 COMPLETE!"
[1] "IMF 4 COMPLETE!"
[1] "IMF 5 COMPLETE!"
[1] "IMF 6 COMPLETE!"
[1] "IMF 7 COMPLETE!"
[1] "IMF 8 COMPLETE!"
[1] "IMF 9 COMPLETE!"
Error: Unknown graphics device ''
when calling
# Finally compute the hand-crafted features here
batch.PLR.analyze.reconstructions(data_path = paths[['data_in']][['features']],
data_path_out = paths[['data_out']][['features']],
RPLR_analysis_path = paths[['analysis']],
parameters, RPLR_paths, masterExcel,
process_only_unprocessed = TRUE,
path_check_for_done = paths[['data_out']][['features']],
no_of_cores_to_use = detectCores(),
pupil_col = 'pupil')
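The empty device name in "Unknown graphics device ''" suggests that options('device') is unset or empty, e.g. in a non-interactive session; explicitly selecting a bitmap device before the feature plots run is one possible workaround (an assumption, not verified against this repo):

```r
# Explicitly select the png bitmap device so plotting code does not fail
# with "Unknown graphics device ''" when no default device is configured.
options(device = 'png')
getOption('device')  # 'png'
```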
"httpuv" needs to be installed manually (httpuv_1.4.5.zip).
"later" needs to be v0.7.3 for the "shiny" package to load it.
The "later" package requires installing R v3.5.1.
R can be updated on Windows using the installr package:
install.packages("installr"); library(installr) # install+load installr
updateR() # updating R
Once R version 3.5.1 is installed, update the "later" package, then install "shiny".
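The steps above can be collapsed into a version-guarded sketch (assuming, as stated above, that 3.5.1 is the minimum R version the required "later" release supports):

```r
# Update R via installr only when the running version is too old, then
# refresh the packages that depend on the newer version.
if (getRversion() < '3.5.1') {
  install.packages('installr')
  installr::updateR()
}
install.packages(c('later', 'shiny'))
```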
Error in mclapply(files_to_process, function(files_to_process) { :
'mc.cores' > 1 is not supported on Windows
when calling
# Do some semi-intelligent decompositions for machine learning data augmentation purposes
# Computes as well 1st and 2nd derivatives (i.e. velocity and acceleration) from the smoothed PLRs
batch.data.decompose.for.augmentation(data_path = paths[['data_out']][['reconstructed']],
data_path_out = paths[['data_out']][['FinalOUT']],
RPLR_recon_path = paths[['recon']],
parameters = param[['decomp_augm']],
RPLR_paths = paths[['RPLR']],
masterExcel = paths[['data_in']][['excelMasterPath']],
process_only_unprocessed = TRUE,
path_check_for_done = paths[['data_out']][['FinalOUT']],
pupil_col = 'denoised')
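A cross-platform guard avoids this error (a sketch; the worker function here is a toy stand-in for the real per-file processing):

```r
library(parallel)

# mclapply() forks the R process, which Windows does not support, so
# mc.cores must stay at 1 there; other OSes can use all detected cores.
# (parLapply() on a PSOCK cluster is the usual cross-platform alternative.)
n_cores <- if (.Platform$OS.type == 'windows') 1L else detectCores()
squares <- mclapply(1:4, function(x) x^2, mc.cores = n_cores)
unlist(squares)  # 1 4 9 16
```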