Dipika (dsikka)
AI & ML interests: None yet
Recent Activity
updated a model about 10 hours ago: nm-testing/Meta-Llama-3-8B-Instruct-AttnQuantOnly
published a model about 11 hours ago: nm-testing/Meta-Llama-3-8B-Instruct-AttnQuantOnly
updated a model about 11 hours ago: nm-testing/Meta-Llama-3-8B-FP8-AttnQuant-WeightQuant
dsikka's activity
New activity in neuralmagic/Sparse-Llama-3.1-8B-ultrachat_200k-2of4-quantized.w4a16 (about 2 months ago):
Model only outputs "!!!!!!!!!!" (1) · #1 opened about 2 months ago by mrhendrey
Update README.md (2) · #2 opened 3 months ago by shariqmobin
What engine should be used to infer this model? (7) · #1 opened 5 months ago by RobertLiu0905
How to download the model with transformer library (5) · #6 opened 4 months ago by Rick10 (see the download sketch after this list)
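For the "How to download the model with transformer library" thread above, a minimal sketch is shown below, assuming the question refers to the Sparse-Llama repository named in this feed and to the standard huggingface_hub and transformers APIs; none of these calls are taken from the discussion itself, and loading the quantized checkpoint may require extra packages depending on the installed transformers version.

# Minimal sketch: downloading a Hub model, assuming the repo id from the
# activity feed above. Calls below are the standard public APIs.
from huggingface_hub import snapshot_download
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "neuralmagic/Sparse-Llama-3.1-8B-ultrachat_200k-2of4-quantized.w4a16"

# Option 1: download the repository files into the local Hub cache only.
local_path = snapshot_download(repo_id=model_id)
print(f"Files downloaded to: {local_path}")

# Option 2: download and load with transformers. Loading this quantized
# checkpoint may need additional packages (e.g. accelerate for device_map,
# and quantization support in the installed transformers version).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")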