Luciano Santa Brígida
lucianosb
AI & ML interests
LLMs for pt-BR (text generation, translation, and classification), image generation, and image classification.
Recent Activity
Liked a model 30 days ago: CohereForAI/aya-expanse-8b
Reacted to ImranzamanML's post with 🧠 about 1 month ago:
Today let's discuss 32-bit (FP32) and 16-bit (FP16) floating-point formats!
Floating-point numbers represent real numbers (like decimals) and consist of three parts:
```
Sign bit: indicates whether the number is positive (0) or negative (1).
Exponent: determines the scale of the number (i.e., how large or small it is by shifting the binary point).
Mantissa (or fraction): represents the significant digits of the number.
```
32-bit Floating Point (FP32)
Total bits: 32 bits
Sign bit: 1 bit
Exponent: 8 bits
Mantissa: 23 bits
For example, -15.375 = -1.111011₂ × 2³, so it would be represented as follows (verified in the sketch below):
Sign bit: 1 (negative number)
Exponent: the true exponent 3 plus the bias of 127, stored as 130 (10000010₂)
Mantissa: the digits after the implicit leading 1, i.e. 111011 padded with zeros to 23 bits
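To make this concrete, here is a minimal Python sketch (Python and the standard struct module are my choice here; the post includes no code) that unpacks -15.375 and prints its three FP32 fields:
```
import struct

# Reinterpret the 32-bit float -15.375 as a raw unsigned integer.
bits = struct.unpack(">I", struct.pack(">f", -15.375))[0]
print(f"{bits:032b}")  # 11000001011101100000000000000000

sign = bits >> 31               # 1 -> negative
exponent = (bits >> 23) & 0xFF  # 130; subtract the bias: 130 - 127 = 3
mantissa = bits & 0x7FFFFF      # fraction bits after the implicit leading 1

print(sign, exponent - 127, f"{mantissa:023b}")  # 1 3 11101100000000000000000
```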
16-bit Floating Point (FP16)
Total bits: 16 bits
Sign bit: 1 bit
Exponent: 5 bits
Mantissa: 10 bits
Example: -15.375 would be stored the same way (see the sketch below):
Sign bit: 1 (negative number)
Exponent: the true exponent 3 plus the FP16 bias of 15, stored in 5 bits as 18 (10010₂); the narrower field limits the range compared to FP32
Mantissa: only 10 fraction bits (1110110000), so less precision
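The same inspection works in half precision, sketched here with NumPy's float16 (NumPy is an assumption; the post names no library). Note the bias is 15, not 127:
```
import numpy as np

# View the 16-bit float -15.375 as its raw unsigned integer bits.
bits = int(np.float16(-15.375).view(np.uint16))
print(f"{bits:016b}")  # 1100101110110000

sign = bits >> 15               # 1 -> negative
exponent = (bits >> 10) & 0x1F  # 18; subtract the FP16 bias: 18 - 15 = 3
mantissa = bits & 0x3FF         # only 10 fraction bits: 1110110000

print(sign, exponent - 15, f"{mantissa:010b}")  # 1 3 1110110000
```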
Precision and Range
FP32: higher precision and a larger range, with about 7 significant decimal digits.
FP16: less precision (around 3-4 significant decimal digits) and a smaller range, but faster computation and half the memory (demonstrated below).
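A quick sketch of that trade-off (again assuming NumPy): rounding to FP16 keeps only about 3 significant decimal digits, and anything above FP16's largest finite value, 65504, overflows to infinity:
```
import numpy as np

x32 = np.float32(3.14159265)     # ~7 significant decimal digits survive
x16 = np.float16(x32)            # rounded to the nearest representable half
print(x32, x16)                  # 3.1415927 3.14

print(np.float16(70000.0))       # inf: 70000 exceeds FP16's range
print(np.finfo(np.float16).max)  # 65504.0, the largest finite FP16 value
```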
Reacted to ImranzamanML's post with 🔥 about 1 month ago (the same post as above)
lucianosb's activity
Adds Age Classifier and NSFW Classifier (1) · #1 opened 4 months ago by lucianosb
New AI bias detection tool for artificial images: Test skin tone and gender bias instantly (6) · #7 opened 4 months ago by fdaudens
Adding the Open Portuguese LLM Leaderboard Evaluation Results · #1 opened 4 months ago by leaderboard-pt-pr-bot
Adding the Open Portuguese LLM Leaderboard Evaluation Results · #1 opened 4 months ago by leaderboard-pt-pr-bot
Adding the Open Portuguese LLM Leaderboard Evaluation Results · #1 opened 5 months ago by leaderboard-pt-pr-bot
Adding the Open Portuguese LLM Leaderboard Evaluation Results · #1 opened 6 months ago by leaderboard-pt-pr-bot
Algorithmic bias (1) · #4 opened 7 months ago by fdaudens
Hi! Introduce yourself! 👋 (20) · #2 opened 7 months ago by fdaudens
Adding the Open Portuguese LLM Leaderboard Evaluation Results · #1 opened 7 months ago by leaderboard-pt-pr-bot
Consulting (2) · #1 opened 7 months ago by GustavoFlore
Link for conversion to safetensors (1) · #8 opened 8 months ago by lucianosb
Adding the Open Portuguese LLM Leaderboard Evaluation Results · #2 opened 8 months ago by leaderboard-pt-pr-bot
Adding `safetensors` variant of this model (1) · #1 opened 8 months ago by lucianosb
Error: 'NoneType' object has no attribute 'pr_url' (6) · #4 opened 8 months ago by kzleong
Solarpunk Futuristic Houses · #45 opened about 1 year ago by lucianosb
Estimated Memory Usage · #6 opened about 1 year ago by lucianosb
Model Quantization · #5 opened about 1 year ago by lucianosb
How to use it with GPT4all? (4) · #2 opened over 1 year ago by eshaanagarwal
I got it running on Colab (3) · #3 opened over 1 year ago by lucianosb
Add instructions for inference · #1 opened over 1 year ago by lucianosb