# lasso-regression

Here are 207 public repositories matching this topic...

A simple machine learning framework written in Swift 🤖
Topics: swift, machine-learning, neural-network, genetic-algorithm, linear-regression, machine-learning-algorithms, regression, artificial-intelligence, machine-learning-library, feedforward-neural-network, kmeans, ridge-regression, polynomial-regression, backpropagation, kmeans-clustering, mlkit, lasso-regression
Updated Aug 28, 2018 - Swift
Code that might be useful to others for learning/demonstration purposes, specifically along the lines of modeling and various algorithms.
Topics: python, r, julia, zip, matlab, irt, pca, survival-analysis, bayesian, stan, em, mixture-model, factor-analysis, gaussian-processes, jags, mixed-models, additive-models, lasso-regression, ordinal-regression, probit
Updated Apr 19, 2020 - R
Predicting Amsterdam house / real estate prices using Ordinary Least Squares-, XGBoost-, KNN-, Lasso-, Ridge-, Polynomial-, Random Forest-, and Neural Network MLP Regression (via scikit-learn)
Topics: real-estate, python, machine-learning, neural-network, random-forest, lasso, xgboost, polynomial, ensemble-learning, ols, decision-trees, ridge-regression, polynomial-regression, knn, multi-layer-perceptron, amsterdam, predicting-housing-prices, lasso-regression, mlp-regressor, knn-regression
Updated Apr 9, 2019 - Python
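As context for entries like the one above that compare several scikit-learn regressors on one dataset, here is a minimal illustrative sketch. It is not the repository's actual code: the synthetic data stands in for the real housing dataset, and the model/alpha choices are assumptions.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in for the housing data (the real dataset is not included here).
X, y = make_regression(n_samples=500, n_features=20, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fit each model on the same split and compare held-out R^2 scores.
models = {
    "OLS": LinearRegression(),
    "Ridge": Ridge(alpha=1.0),
    "Lasso": Lasso(alpha=1.0),
    "KNN": KNeighborsRegressor(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```

On real data the same loop would simply take the preprocessed feature matrix in place of the synthetic one.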
Machine learning algorithms with Dart
Topics: dart, classifier, data-science, machine-learning, algorithm, linear-regression, machine-learning-algorithms, regression, hyperparameters, sgd, logistic-regression, softmax-regression, dartlang, stochastic-gradient-descent, softmax, lasso-regression, batch-gradient-descent, mini-batch-gradient-descent, softmax-classifier, softmax-algorithm
Updated Jun 27, 2020 - Dart
Sample questions from a job interview in binary options trading
Topics: statistics, betting, bayesian, investment, arima, ridge-regression, mcmc, acd, garch, elastic-net, ets, lasso-regression, investment-strategies, kelly-criterion, investment-portfolio, markowitz-portfolio
Updated Nov 3, 2018
Implemented ADMM for solving convex optimization problems such as Lasso and Ridge regression
Updated Nov 13, 2018 - Jupyter Notebook
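For reference, the standard ADMM iteration for the Lasso can be sketched as follows. This is the textbook algorithm, not the repository's implementation; the function name `admm_lasso` and all parameter values are made up here.

```python
import numpy as np

def soft_threshold(v, k):
    """Elementwise soft-thresholding, the proximal operator of k*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=300):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 with ADMM."""
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))  # factor once, reuse every iteration
    Atb = A.T @ b
    for _ in range(n_iter):
        q = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, q))  # x-update: ridge-like solve
        z = soft_threshold(x + u, lam / rho)             # z-update: prox of the l1 term
        u = u + x - z                                    # dual update on the constraint x = z
    return z  # z carries exact zeros

# Small demo on synthetic sparse data.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = admm_lasso(A, b, lam=1.0)
print(np.count_nonzero(x_hat))  # only a few coefficients survive
```

Returning `z` rather than `x` is deliberate: the soft-threshold step makes `z` exactly sparse, while `x` is only approximately so.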
MATLAB library of gradient descent algorithms for sparse modeling: Version 1.0.3
Topics: machine-learning, big-data, algorithms, optimization, machine-learning-algorithms, solver, lasso, logistic-regression, gradient-descent, support-vector-machines, admm, proximal-algorithms, proximal-operators, sparse-regression, optimization-algorithms, matrix-completion, elasticnet, lasso-regression, coordinate-descent, sparse-linear-solver
Updated Nov 20, 2018 - MATLAB
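Proximal gradient descent (ISTA) is one of the algorithm families a sparse-modeling library like the one above typically covers. The sketch below is illustrative Python, not the MATLAB library's code; `ista_lasso` and its defaults are assumptions.

```python
import numpy as np

def ista_lasso(A, b, lam, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient descent."""
    step = 1.0 / np.linalg.norm(A, ord=2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth least-squares term
        v = x - step * grad                       # plain gradient step
        x = np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0)  # prox step: soft-threshold
    return x

# Demo on synthetic sparse data.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = ista_lasso(A, b, lam=1.0)
```

Each iteration is a gradient step on the smooth part followed by the proximal operator of the l1 penalty, which is exactly the soft-thresholding map.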
Automated Essay Scoring on The Hewlett Foundation dataset on Kaggle
Topics: machine-learning, natural-language-processing, linear-regression, sklearn, nltk, ensemble-learning, text-processing, text-analytics, ridge-regression, cohens-kappa, text-regression, lasso-regression, support-vector-regression, gradient-boosting-regressor, automatic-essay-scoring
Updated Apr 26, 2018 - Jupyter Notebook
Updated Jun 25, 2020 - R
TwitPersonality: Computing Personality Traits from Tweets using Word Embeddings and Supervised Learning
Topics: python, natural-language-processing, twitter, svm, regression, tweepy, personality-traits, lasso-regression, wordembeddings, big5
Updated Jan 31, 2019 - Python
Python notebooks for my graduate class on Detection, Estimation, and Learning. Intended for in-class demonstration. Notebooks illustrate a variety of concepts, from hypothesis testing to estimation to image denoising to Kalman filtering. Feel free to use or modify for your instruction or self-study.
Topics: python, machine-learning, signal-processing, detection, jupyter-notebook, regression, estimation, lasso, ridge-regression, hypothesis-testing, maximum-likelihood, teaching-materials, kalman-filter, python-notebook, lasso-regression, estimation-theory, tikhonov-regularization
Updated Apr 23, 2018 - Jupyter Notebook
Understand the relationships between various features and the sale price of a house using exploratory data analysis and statistical analysis. Applied ML algorithms such as Multiple Linear Regression, Ridge Regression, and Lasso Regression in combination with cross-validation. Performed parameter tuning, compared the test scores, and suggested the best model for predicting the final sale price. Seaborn is used to plot graphs and scikit-learn for statistical analysis.
Topics: python, machine-learning, correlation, linear-regression, cross-validation, data-visualization, data-extraction, data-analysis, regularization, standardization, datawrangling, predictive-modeling, ridge-regression, data-exploration, k-fold, lasso-regression, encoding-library, parameter-tuning, root-mean-squared-error-metric, regression-analysis
Updated Jan 19, 2018 - Jupyter Notebook
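The tune-and-compare workflow that entry describes can be sketched with scikit-learn's cross-validated grid search. This is a hedged illustration, not the project's notebook: the alpha grid and synthetic data are assumptions.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the housing features and sale prices.
X, y = make_regression(n_samples=300, n_features=30, n_informative=10,
                       noise=5.0, random_state=0)

# Tune the regularization strength of each model with 5-fold CV.
param_grid = {"model__alpha": [0.01, 0.1, 1.0, 10.0]}
results = {}
for name, est in [("ridge", Ridge()), ("lasso", Lasso(max_iter=10000))]:
    pipe = Pipeline([("scale", StandardScaler()), ("model", est)])
    search = GridSearchCV(pipe, param_grid, cv=5, scoring="r2")
    search.fit(X, y)
    results[name] = (search.best_params_["model__alpha"], search.best_score_)

for name, (alpha, score) in results.items():
    print(f"{name}: best alpha = {alpha}, CV R^2 = {score:.3f}")
```

Putting the scaler inside the pipeline matters: it is refit on each training fold, so no information from the validation fold leaks into the standardization.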
Introduction: The context is the 2016 public-use NH medical claims files obtained from NH CHIS (Comprehensive Health Care Information System). The dataset contains commercial insurance claims and a small fraction of Medicaid and Medicare payments for dually eligible people. The primary purpose of this assignment is to test machine learning (ML) skills in a real case-analysis setting. You are expected to clean and process the data and then apply various ML techniques, including linear and nonlinear models such as regularized regression, MARS, and partitioning methods. You are expected to use at least two of R, Python, and JMP.
Data details: The medical claims file for 2016 contains ~17 million rows and ~60 columns of data, comprising ~6.5 million individual medical claims. These are all commercial claims filed by healthcare providers in 2016 in the state of NH; ~88% were for residents of NH and the remainder for out-of-state visitors who sought care in NH. Each claim consists of one or more line items, each indicating a procedure done during the doctor's visit. Two columns, the billed amount and the paid amount for the care provided, are of primary interest. The main objective is to predict "paid amount per procedure" by mapping the plethora of features available in the dataset. You are also expected to create new features using the existing ones or external data sources.
Objectives:
Step 1: Take a random sample of 1 million unique claims, such that all line items related to each claim are included in the sample. This will result in a little less than 3 million rows of data.
Step 2: Clean up the data, understand the distributions, and create new features if necessary.
Step 3: Run predictive models using a validation method of your choice.
Step 4: Write a descriptive report (less than 10 pages) describing the process and your findings.
Updated Jan 17, 2019 - Jupyter Notebook
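Step 1 of that assignment (sampling whole claims so that every line item of a sampled claim is kept) might be sketched in pandas as below. The column names `claim_id` and `billed_amount` are assumptions for illustration, not the actual CHIS file layout, and the toy frame stands in for the ~17 million-row file.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Toy stand-in for the claims file: 10 claims with 1-3 line items each.
line_counts = rng.integers(1, 4, size=10)
claims = pd.DataFrame({"claim_id": np.repeat(np.arange(10), line_counts)})
claims["billed_amount"] = rng.uniform(50, 500, size=len(claims))

# Sample claim IDs first, then keep every line item of each sampled claim,
# so no claim ends up partially represented.
n_sample = 5
sampled_ids = pd.Series(claims["claim_id"].unique()).sample(n=n_sample, random_state=0)
sample = claims[claims["claim_id"].isin(sampled_ids)]
print(sample["claim_id"].nunique())  # prints 5
```

Sampling IDs rather than rows is the key point: a plain `claims.sample(...)` would drop some line items of a claim while keeping others.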
Harvard project: accuracy improvement by adding seasonality-based premium pricing
Topics: python, linear-regression, scikit-learn, regularization, ridge-regression, k-fold, lasso-regression, airbnb-pricing-prediction, airbnb-dataset, premium-pricing
Updated Dec 15, 2016 - Jupyter Notebook
Jupyter notebook that outlines the process of creating a machine learning predictive model. Predicts the peak "Win Shares" of current draft prospects based on numerous features such as college stats, projected draft pick, physical profile, and age. I try out multiple models and pick the best-performing one for the data based on my judgment.
Topics: nba, machine-learning, neural-network, linear-regression, scikit-learn, ridge-regression, multi-layer-perceptron, nba-analytics, prospects, scikitlearn-machine-learning, nba-prediction, lasso-regression, support-vector-regression, regression-algorithms, college-basketball
Updated Apr 17, 2018 - Jupyter Notebook
Analysis of NBA player stats and salaries from the 2016-17 season for the 2017-18 season
Topics: learning, player, nba, machine, regression, lasso, linear, nba-stats, nba-analytics, nba-visualization, nba-stats-api, elasticnet, nba-prediction, lasso-regression, ridge, nba-players, nba-data, find-undervalued-players
Updated Aug 10, 2017 - Python
Nonparametric regression and prediction using the highly adaptive lasso algorithm
Updated Jan 6, 2018 - R
Applied Machine Learning
Topics: python, machine-learning, r, tensorflow, svm, naive-bayes, linear-regression, machine-learning-algorithms, caret, regularization, ridge-regression, principal-component-analysis, principal-components, em-algorithm, nips, elasticnet, lasso-regression, iris-dataset, klar, convolu
Updated Jun 15, 2016 - Python
Roger Ebert's movie ratings prediction
Topics: data-science, machine-learning, movies, linear-regression, k2-data-science, regression-models, movie-reviews, lasso-regression, movie-ratings, movie-rating-prediction
Updated May 31, 2017 - Jupyter Notebook
Algorithms for Lasso estimation of NARMAX models.
Updated Dec 30, 2019 - Julia
This repository contains only projects using regression analysis techniques. Examples include a comprehensive analysis of retail store expansion strategies using Lasso and Ridge regressions.
Updated May 23, 2018 - Jupyter Notebook
Sequential adaptive elastic net (SAEN) approach, complex-valued LARS solver for weighted Lasso/elastic-net problems, and sparsity (or model) order detection with an application to single-snapshot source localization.
Topics: adaptive-learning, sparse-regression, matlab-toolbox, regularized-linear-regression, elastic-net, sparse-reconstruction, lasso-regression, source-localization, acoustic-model, regularization-paths, direction-of-arrival, sparse-regularization, compressed-beamforming, complex-valued-data, solution-path
Updated Mar 5, 2020 - MATLAB
For quick search
Updated Jan 31, 2019 - Python
jolars commented Jun 3, 2019:
On the following line, I believe the sparsity is destroyed by the subtraction:
https://github.com/jolars/sgdnet/blob/7e9261a83263a616cc07053c47f5cde5c2333cdb/src/utils.h#L70
It is not necessarily a big problem, since the sparsity is only destroyed temporarily for each column, but it could probably be implemented better with a for loop.
Analyzes weightlifting videos for correct posture
Topics: opencv, machine-learning, computer-vision, random-forest, ensemble-learning, logistic-regression, ridge-regression, mpii-dataset, weightlifting, pose-estimation, elastic-net, lasso-regression, openpose, keypoint-detection
Updated May 19, 2019 - Jupyter Notebook
This repository corresponds to the course "Statistical Learning Theory", taught by Pedro Delicado at the School of Mathematics and Statistics (FME), UPC, within the MESIO-UPC-UB Joint Interuniversity Master's Program.
Topics: machine-learning, statistics, statistical-learning, regularization, upc, ridge-regression, regression-models, regularized-linear-regression, lasso-regression
Updated Jun 21, 2019 - Jupyter Notebook
Various LASSO-type autoregressive models
Updated Jun 29, 2020 - C++
Machine-Learning-Regression
Topics: cross-validation, nearest-neighbor-search, gradient-descent, polynomial-regression, loss-functions, kernel-regression, lasso-regression, coordinate-descent, multiple-regression, overfitting, simple-linear-regression, ridge-regre, bias-variance-tradeoff, generalization-error, coefficient-path
Updated Jul 19, 2018 - Jupyter Notebook
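Coordinate descent, one of the techniques that course repository covers, is the workhorse solver for the Lasso. The sketch below is an illustrative implementation of the standard cyclic update, not the repository's notebooks; `cd_lasso` and the demo data are made up here.

```python
import numpy as np

def cd_lasso(A, b, lam, n_sweeps=100):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by cyclic coordinate descent."""
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)       # per-column squared norms
    r = b - A @ x                       # running residual
    for _ in range(n_sweeps):
        for j in range(n):
            r = r + A[:, j] * x[j]      # remove coordinate j's contribution
            rho_j = A[:, j] @ r         # correlation of column j with the partial residual
            # Exact minimizer in coordinate j: soft-threshold, then rescale.
            x[j] = np.sign(rho_j) * max(abs(rho_j) - lam, 0.0) / col_sq[j]
            r = r - A[:, j] * x[j]      # restore residual with the updated x[j]
    return x

# Demo on synthetic sparse data.
rng = np.random.default_rng(2)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = cd_lasso(A, b, lam=1.0)
```

Maintaining the residual `r` incrementally keeps each coordinate update O(n_samples) instead of recomputing `A @ x` from scratch.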
Source code for my Master's thesis project
Topics: python, natural-language-processing, twitter, svm, regression, tweepy, personality-traits, lasso-regression, wordembeddings, big5
Updated Sep 26, 2018
(see branch docs)
hcat(X, 1); IterativeSolvers.cg and not IterativeSolvers.lsqr, as it allows specifying the operator as a linear map, which is efficient and avoids copying when having to add a column to X; anyway, it should be identical apart from pathological cases.