Suppose there are 8 features in the dataset. I use PCA and find, from the cumulative sum of the explained variance ratio, that 99% of the information is in the first 3 features. Why do I then need to fit and transform these 3 features with PCA in order to use them for training my neural network? Why can't I just use the three features as is?
Using features without applying PCA
Asked by Tanmay Bhatnagar
The reason is that when PCA tells you that 99% of the variance is explained by the first three components, it does not mean that 99% of the variance is explained by the first three features. Each principal component is a linear combination of all of the features; it is usually not a feature itself. For example, principal components must be orthogonal to each other, while the original features need not be. That is why you have to fit PCA and transform the data: the three inputs you feed the neural network are the projections of the data onto those components, which are new derived features, not three columns picked out of the original dataset.
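To make the distinction concrete, here is a minimal sketch, assuming scikit-learn and purely synthetic data (the array `X`, its shapes, and the random construction are illustrative, not taken from the question):

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic data: 8 features where the last 5 are (nearly) linear
# combinations of the first 3, so the data is effectively 3-dimensional.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
X[:, 3:] = X[:, :3] @ rng.normal(size=(3, 5)) + 0.01 * rng.normal(size=(200, 5))

pca = PCA().fit(X)
print(np.cumsum(pca.explained_variance_ratio_))  # crosses ~0.99 by the 3rd component

# Each retained component mixes ALL 8 original columns,
# not a selection of 3 of them:
print(pca.components_[:3].round(2))              # shape (3, 8)

# To feed 3 inputs to the network, project (transform) the data
# onto those components; the result is 3 new derived features.
X_reduced = PCA(n_components=3).fit_transform(X)
print(X_reduced.shape)                           # (200, 3)
```

The key thing to look at is `components_`: each of the three retained components has non-zero weights on all eight columns, so keeping three columns of the original data is not the same as transforming the data onto the top three components.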