Morgan McGuire
@MorganMcGuire
Public Apps
wandb hackathon - gpt4me
2 years ago
wandb hackathon - LLMs for Wandb and Replit
2 years ago
This Repl is a proof of concept for how we can combine LLMs with Python code to scale human effort. A human makes the one real decision (which Regressor to use); the rest of the boilerplate code gets reused, and the ready-to-run code is saved to a new file. The configs can be changed via chat/natural language as well.
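As a rough illustration of that idea — a hypothetical sketch, not the Repl's actual code: the prompt, model name, and output file name are made up, and it assumes the OpenAI Python SDK with an API key in the environment:

# Hypothetical sketch of the idea above, not the Repl's actual code.
# Assumes the OpenAI Python SDK (pip install openai) and an
# OPENAI_API_KEY in the environment; prompt, model, and file name
# are illustrative.
from openai import OpenAI

BOILERPLATE = """\
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
{regressor_import}

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = {regressor_init}
model.fit(X_train, y_train)
print("MSE:", mean_squared_error(y_test, model.predict(X_test)))
"""

def generate_script(choice: str) -> str:
    """Turn a natural-language regressor choice into a runnable script."""
    client = OpenAI()
    reply = client.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": (
                f"The user chose: {choice}. Reply with exactly two lines: "
                "a scikit-learn import statement, then a constructor call."
            ),
        }],
    )
    import_line, init_line = reply.choices[0].message.content.strip().splitlines()[:2]
    return BOILERPLATE.format(regressor_import=import_line,
                              regressor_init=init_line)

# The human decision (which regressor) arrives as natural language;
# everything else is reused boilerplate, saved ready to run.
with open("ready_model.py", "w") as f:
    f.write(generate_script("a random forest with 200 trees"))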
Tutorial
2 years ago
You've got a great idea for a learning experience on Replit - and we've got a tutorial feature! Learn how to add a tutorial to any Repl - that's right, literally ANY Repl!
nanoGPT in Replit
2 years ago
5 likes
7 forks
Flask
2 years ago
Minimalist Python web framework.
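For reference, the kind of minimal app this template boils down to — a generic sketch, not this Repl's exact code; the route and greeting are illustrative, though binding to 0.0.0.0:8080 is the usual Replit convention:

from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # A single route returning plain text is all a minimal Flask app needs.
    return "Hello from Flask!"

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the server is reachable from outside the container.
    app.run(host="0.0.0.0", port=8080)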
wandb-training
2 years ago
LavenderJuniorDigit
2 years ago
Weights and Biases Tutorial
3 years ago
A simple Repl to train a classifier using PyTorch on a toy dataset, with the goodness of Weights & Biases baked in.

Click run to get a welcome page. You will be asked to enter a dropout rate. You can also enter your W&B account's API token to log the run to your Weights & Biases account. If you haven't already, you can create a free W&B account by visiting https://wandb.ai/signup.

The model will train with your dropout rate and log the metrics to a W&B run page. You can find the link to the run in the terminal. Compare multiple dropout rates and see whether dropout helps train better models.

This tutorial was inspired by this W&B Report: https://wandb.ai/authors/ayusht/reports/Implementing-Dropout-in-PyTorch-With-Example--VmlldzoxNTgwOTE

I have used Rich heavily to create terminal magic. There might be hiccups in the UI, so consider running this in full screen. PS: If you are a Rich expert, do give feedback to improve it further. :)
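For flavor, here is a minimal sketch of the loop described above — hypothetical, not the Repl's actual code: the model shape, toy data, and hyperparameters are illustrative; only the wandb init/log/finish calls follow the real wandb API:

# Hypothetical sketch: a toy PyTorch classifier with a user-chosen
# dropout rate, metrics logged to Weights & Biases.
import torch
import torch.nn as nn
import wandb

dropout_rate = float(input("Enter a dropout rate (e.g. 0.3): "))
wandb.init(project="dropout-tutorial", config={"dropout": dropout_rate})

# Toy dataset: two classes separated by the sign of the feature sum.
X = torch.randn(512, 10)
y = (X.sum(dim=1) > 0).long()

model = nn.Sequential(
    nn.Linear(10, 64),
    nn.ReLU(),
    nn.Dropout(dropout_rate),  # the knob being compared across runs
    nn.Linear(64, 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    opt.zero_grad()
    logits = model(X)
    loss = loss_fn(logits, y)
    loss.backward()
    opt.step()
    acc = (logits.argmax(dim=1) == y).float().mean().item()
    wandb.log({"epoch": epoch, "loss": loss.item(), "accuracy": acc})

wandb.finish()  # the run page link is printed in the terminal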
DarkseagreenLightpinkBootstrapping
3 years ago
testing
4 years ago