Traceback (most recent call last):
  File "main.py", line 83, in <module>
    main()
  File "main.py", line 30, in main
    random_seed=opts.random_seed)
  File "/storage1/lilab/student/ynwang/meta_data/batch_effact/knn_1499features/NormAE/datasets.py", line 136, in get_metabolic_data
    meta_df, y_df = pre_transfer(meta_df, y_df)
  File "/storage1/lilab/student/ynwang/meta_data/batch_effact/knn_1499features/NormAE/transfer.py", line 24, in __call__
    x = self.scaler.fit_transform(values)
  File "/home/ynwang/miniconda3/envs/python3.7/lib/python3.7/site-packages/sklearn/base.py", line 852, in fit_transform
    return self.fit(X, **fit_params).transform(X)
  File "/home/ynwang/miniconda3/envs/python3.7/lib/python3.7/site-packages/sklearn/preprocessing/_data.py", line 806, in fit
    return self.partial_fit(X, y, sample_weight)
  File "/home/ynwang/miniconda3/envs/python3.7/lib/python3.7/site-packages/sklearn/preprocessing/_data.py", line 847, in partial_fit
    reset=first_call,
  File "/home/ynwang/miniconda3/envs/python3.7/lib/python3.7/site-packages/sklearn/base.py", line 566, in _validate_data
    X = check_array(X, **check_params)
  File "/home/ynwang/miniconda3/envs/python3.7/lib/python3.7/site-packages/sklearn/utils/validation.py", line 817, in check_array
    % (n_features, array.shape, ensure_min_features, context)
ValueError: Found array with 0 feature(s) (shape=(598, 0)) while a minimum of 1 is required by StandardScaler.
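For what it's worth, the ValueError itself comes from scikit-learn's input validation rather than from NormAE: StandardScaler refuses any array with zero columns, which is exactly what the (598, 0) shape in the traceback shows. A minimal standalone reproduction, independent of NormAE:

import numpy as np
from sklearn.preprocessing import StandardScaler

# 598 rows but 0 columns, the shape reported in the traceback above
empty_features = np.empty((598, 0))

# check_array() inside StandardScaler.fit() requires at least 1 feature, so this raises:
# ValueError: Found array with 0 feature(s) (shape=(598, 0)) while a minimum of 1
# is required by StandardScaler.
StandardScaler().fit_transform(empty_features)

So the question is why the values handed to pre_transfer end up with zero columns when the input contains no QC samples.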
I received the same error without QC samples. I narrowed the source of this error down to lines 94-97 in datasets.py. Unfortunately, simply commenting out those lines leads to new errors, because visual.py later expects qc_pca (a possible guard is sketched below). It appears that in its current form the tool cannot be used without QC samples.
@luyiyun, could you please update NormAE so that it can be used without QC samples?
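Until that happens, one possible interim workaround is to skip the QC-specific preprocessing when the QC subset is empty instead of deleting it, and to make the plotting code tolerate a missing qc_pca. This is only a rough sketch; apart from pre_transfer and qc_pca, which appear above, every name in it is a placeholder rather than actual NormAE code:

def maybe_pre_transfer(df, y_df, pre_transfer):
    """Apply pre_transfer only when df actually has rows and columns.

    Returns (None, None) for an empty subset (e.g. when there are no QC
    samples), so callers such as the qc_pca plotting in visual.py can
    check for None and skip the QC-only figures instead of crashing.
    """
    if df is None or 0 in df.shape:
        return None, None
    return pre_transfer(df, y_df)

The same None check would then have to be added wherever visual.py uses qc_pca.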
I got the error above. The settings of the training run were:
task: train
meta_data: ./allearly_MSRT_knn_log.csv
sample_data: ./batch1_6_alldata_order_batch_noqc.csv
train_data: all
save: ./result_noqc
ae_encoder_units: [1000, 1000]
ae_decoder_units: [1000, 1000]
disc_b_units: [250, 250]
disc_o_units: [250, 250]
bottle_num: 500
dropouts: (0.3, 0.1, 0.3, 0.3)
lambda_b: 1.0
lambda_o: 1.0
lr_rec: 0.0002
lr_disc_b: 0.005
lr_disc_o: 0.0005
epoch: (1000, 10, 700)
use_batch_for_order: True
batch_size: 64
load: None
visdom_env: main
visdom_port: 8097
num_workers: 12
use_log: False
use_batch: None
sample_size: None
random_seed: 1234
device: None
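As a quick sanity check of the input, independent of any code fix, it may help to confirm what the sample-information file used above actually contains; the 'class' column and the 'QC' label below are only guesses at a typical layout, not NormAE's required format:

import pandas as pd

# Inspect the sample-information file from the run above.
info = pd.read_csv("./batch1_6_alldata_order_batch_noqc.csv", index_col=0)
print(info.shape)
print(info.columns.tolist())

# Hypothetical check: count rows labelled as QC, assuming a 'class' column exists.
if "class" in info.columns:
    n_qc = (info["class"].astype(str).str.upper() == "QC").sum()
    print(n_qc, "rows labelled QC")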