compat.Pytorch2River #972
-
Hello all, thank you in advance.

```python
from river import compat
from river import datasets
from river import evaluate
from river import metrics
from river import preprocessing
from river import stream
from torch import nn
from torch import optim
import torch
import pandas as pd
import itertools

_ = torch.manual_seed(0)

n_features = 3

def build_torch_mlp_regressor(n_features):
    net = nn.Sequential(
        nn.Linear(n_features, 45),
        nn.Linear(45, 50),
        nn.Linear(50, 50),
        nn.Linear(50, 3)
    )
    return net

model = compat.PyTorch2RiverRegressor(
    build_fn=build_torch_mlp_regressor(n_features),
    loss_fn=nn.MSELoss,
    optimizer_fn=optim.Adam
)

df = pd.read_csv('/content/drive/MyDrive/data/Final_data/one_lakh_more_accurate.csv')
train_size = 100

for xb in pd.read_csv('/content/drive/MyDrive/data/Final_data/soft_robotics_2_scaled.csv', chunksize=50):
    y = xb.loc[:, 'X':'Z']
    x = xb.loc[:, 'P_set_1':'P_set_3']
    model.predict_one(x)
    model.learn_one(x, y)
```

The error I got: T
Replies: 2 comments 2 replies
-
Basically:

```python
model = compat.PyTorch2RiverRegressor(
    build_fn=build_torch_mlp_regressor,
    loss_fn=nn.MSELoss,
    optimizer_fn=optim.Adam
)
```

Why is this? Because in online learning you, the user, should never have to worry about the number of features in the data stream. Models should handle this for you. Note that we'll be removing PyTorch compatibility in the next release. It will instead be located in river-extra.
-
Take a look at `iter_csv` as well as `iter_pandas`.

No, it doesn't support that. But I encourage you to consider the `multioutput` module. It contains model wrappers, which you can wrap around any regression model, which includes `PyTorch2RiverRegressor`.
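To sketch what a multi-output wrapper does conceptually (this is a plain-Python illustration of the idea, not river's actual `multioutput` API; `PerTargetWrapper` and `MeanRegressor` are hypothetical names): it keeps one single-target regressor per output key and routes each target value to its own copy.

```python
import copy

class PerTargetWrapper:
    """Sketch: wrap a single-output online regressor to handle dict targets.

    Each output key (e.g. 'X', 'Y', 'Z') gets its own clone of the base model.
    """

    def __init__(self, base_model):
        self.base_model = base_model
        self.models = {}  # one regressor per target name

    def learn_one(self, x, y):
        for target, value in y.items():
            if target not in self.models:
                self.models[target] = copy.deepcopy(self.base_model)
            self.models[target].learn_one(x, value)
        return self

    def predict_one(self, x):
        return {t: m.predict_one(x) for t, m in self.models.items()}

class MeanRegressor:
    # Minimal single-output "regressor": predicts the running mean of the target
    def __init__(self):
        self.n = 0
        self.total = 0.0
    def learn_one(self, x, y):
        self.n += 1
        self.total += y
        return self
    def predict_one(self, x):
        return self.total / self.n if self.n else 0.0

wrapper = PerTargetWrapper(MeanRegressor())
wrapper.learn_one({'P_set_1': 0.1}, {'X': 1.0, 'Y': 2.0, 'Z': 3.0})
wrapper.learn_one({'P_set_1': 0.2}, {'X': 3.0, 'Y': 2.0, 'Z': 1.0})
print(wrapper.predict_one({'P_set_1': 0.3}))  # {'X': 2.0, 'Y': 2.0, 'Z': 2.0}
```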