Pinned
-
LLaMA in 60 Lines
import json
import pickle
import struct
import zipfile
import numpy as np
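The imports above hint at what the gist does: it reads a PyTorch-style checkpoint (a zip archive of tensor buffers plus metadata) with nothing but the standard library and NumPy. A minimal sketch of that idea, assuming a simplified layout of one raw little-endian buffer plus a JSON shape record per tensor (this is illustrative, not the gist's actual format, which would also involve `pickle` and `struct` to parse PyTorch's own records):

```python
import io
import json
import zipfile

import numpy as np

def save_tensor(path, name, arr):
    """Store one float32 array as raw bytes plus a JSON shape record."""
    with zipfile.ZipFile(path, 'w') as zf:
        zf.writestr(f'{name}/meta.json',
                    json.dumps({'shape': list(arr.shape), 'dtype': 'f4'}))
        zf.writestr(f'{name}/data.bin', arr.astype('<f4').tobytes())

def load_tensor(path, name):
    """Read the raw buffer back and restore its shape from the metadata."""
    with zipfile.ZipFile(path) as zf:
        meta = json.loads(zf.read(f'{name}/meta.json'))
        buf = zf.read(f'{name}/data.bin')
        return np.frombuffer(buf, dtype='<f4').reshape(meta['shape'])

# Round-trip a toy weight matrix through an in-memory zip archive.
w = np.arange(6, dtype=np.float32).reshape(2, 3)
buf = io.BytesIO()
save_tensor(buf, 'tok_embeddings.weight', w)
restored = load_tensor(buf, 'tok_embeddings.weight')
assert np.array_equal(restored, w)
```

The appeal of this approach is having zero framework dependencies at inference time: the weights come straight out of the archive as NumPy arrays.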
-
RLAIF
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained('NousResearch/Nous-Hermes-Llama2-13b', device_map='auto')
tokenizer = AutoTokenizer.from_pretrained('NousResearch/Nous-Hermes-Llama2-13b')
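RLAIF (reinforcement learning from AI feedback) replaces human preference labels with scores from a judge model. A toy sketch of the core preference step, with a stand-in scoring function rather than the Hermes model loaded above (hypothetical illustration, not the pinned repo's actual code):

```python
import numpy as np

def judge(response):
    # Stand-in "AI judge" reward: in a real RLAIF loop this would be a
    # language model scoring the response; here we prefer longer answers
    # and penalize ones that just ask a question back.
    return len(response.split()) - 5 * response.count('?')

def best_of_n(candidates):
    """Best-of-n sampling: keep the candidate the judge scores highest."""
    scores = np.array([judge(c) for c in candidates])
    return candidates[int(scores.argmax())], scores

candidates = [
    'Paris is the capital of France.',
    'Is it Paris?',
    'The capital of France is Paris, on the Seine.',
]
chosen, scores = best_of_n(candidates)
```

The chosen/rejected pairs produced this way can then feed a preference-optimization objective (e.g. DPO or PPO) exactly as human-labeled pairs would.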
-
deep-learning-keras-euroscipy2016
Keras @ EuroSciPy 2016
Jupyter Notebook
-
HungaBunga
Brute-force all sklearn models with all parameters using .fit and .predict!
Python
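The description above captures the whole idea: try every estimator and keep the winner. A minimal sketch of that brute-force loop over a few scikit-learn classifiers (an assumed illustration of the concept, not HungaBunga's actual API):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# One shared split so every candidate is scored on the same held-out data.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

candidates = [
    LogisticRegression(max_iter=1000),
    KNeighborsClassifier(),
    DecisionTreeClassifier(random_state=0),
]

# Every sklearn estimator shares the .fit/.score interface, which is what
# makes the exhaustive loop possible in the first place.
scores = {type(m).__name__: m.fit(X_tr, y_tr).score(X_te, y_te)
          for m in candidates}
best = max(scores, key=scores.get)
```

The real tool extends this with parameter grids per model; the uniform `.fit`/`.predict` contract is the only thing the loop relies on.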