Dataset columns (name: dtype, value statistics):

repo_name: stringclasses (28 values)
pr_number: int64 (1.86k to 122k)
pr_title: stringlengths (5 to 204)
author: stringlengths (3 to 58)
git_commit_prev: stringlengths (40 to 40)
git_commit_curr: stringlengths (40 to 40)
date_created: stringlengths (25 to 25)
date_merged: stringlengths (25 to 25)
query: stringlengths (12 to 65.6k)
context_file_path: stringlengths (6 to 233)
label: int64 (-1 to 1)
language: stringclasses (5 values)
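Each row pairs a PR description (query) with one candidate repository file (context_file_path) and an integer label. As a rough illustration only, a dataset with this schema could be loaded and filtered with the Hugging Face datasets library; the dataset identifier below is a placeholder, since the source does not name one, and the reading of label == 1 as "relevant file" is an assumption based on the sample rows.

# A minimal sketch, assuming a hypothetical dataset identifier; adjust to the
# real path where this data is published.
from datasets import load_dataset

ds = load_dataset("example-org/pr-file-retrieval", split="train")  # placeholder name

# Keep only rows where the candidate file appears relevant to the PR
# (label == 1 in the sample rows; label == -1 marks a non-relevant file).
positives = ds.filter(lambda row: row["label"] == 1)
print(positives[0]["pr_title"], positives[0]["context_file_path"])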
repo_name: keras-team/keras
pr_number: 18,928
pr_title: Fix descrepancies in Conv Module docstrings regarding data_format
author: khdlr
git_commit_prev: 19184e9a80f408e6812f0a21298892a25b14bf14
git_commit_curr: 7d431cea42d13f189473f6c0d2d33eaa5228c40f
date_created: 2023-12-12 02:07:05+00:00
date_merged: 2023-12-12 17:23:20+00:00
language: python
query: Fix descrepancies in Conv Module docstrings regarding data_format. Stumbled upon some inconsistencies in the docs for conv layers, e.g. Conv2d, which currently states > `"channels_last"` corresponds to inputs with shape `(batch_size, channels, height, width)` which seems wrong, especially since a bit further down, it mentions > Input Shape: If `data_format="channels_last"`: A 4D tensor with shape: `(batch_size, height, width, channels)` Similar inconsisties exist for other conv layers, namely conv3d, conv3d_transpose, depthwise_conv2d, separable_conv2d. So here's a simple Doc Fix to get the documentation for the `data_format` argument to conv layers in line with the expected input shape.
Rows for this PR, as context_file_path (label), 30 in total:
  ./keras/models/sequential.py (-1)
  ./guides/sequential_model.py (-1)
  ./keras/utils/audio_dataset_utils_test.py (-1)
  ./keras/legacy/saving/serialization.py (-1)
  ./benchmarks/model_benchmark/image_classification_benchmark.py (-1)
  ./keras/layers/normalization/spectral_normalization.py (-1)
  ./keras/layers/preprocessing/category_encoding.py (-1)
  ./examples/keras_io/tensorflow/structured_data/structured_data_classification_from_scratch.py (-1)
  ./keras/trainers/data_adapters/array_data_adapter_test.py (-1)
  ./keras/optimizers/adamw.py (-1)
  ./examples/keras_io/vision/siamese_contrastive.py (-1)
  ./keras/layers/reshaping/repeat_vector_test.py (-1)
  ./keras/legacy/layers.py (-1)
  ./keras/models/variable_mapping.py (-1)
  ./keras/layers/reshaping/permute.py (-1)
  ./keras/backend/torch/trainer.py (-1)
  ./keras/layers/core/embedding.py (-1)
  ./keras/layers/convolutional/conv_test.py (-1)
  ./keras/testing/test_utils_test.py (-1)
  ./keras/layers/pooling/global_max_pooling2d.py (-1)
  ./keras/regularizers/regularizers.py (-1)
  ./keras/utils/summary_utils.py (-1)
  ./keras/layers/preprocessing/index_lookup.py (-1)
  ./keras/layers/preprocessing/discretization_test.py (-1)
  ./keras/layers/core/dense_test.py (-1)
  ./keras/layers/rnn/conv_lstm.py (-1)
  ./keras/backend/torch/core.py (-1)
  ./keras/applications/vgg16.py (-1)
  ./keras/layers/preprocessing/random_translation.py (-1)
  ./keras/backend/torch/optimizers/torch_sgd.py (-1)
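The rows above all come from a documentation fix concerning the data_format argument of the convolution layers. As a hedged illustration (not part of the PR itself) of the convention being corrected, with data_format="channels_last" a Keras Conv2D layer expects inputs shaped (batch_size, height, width, channels):

# Illustration only: "channels_last" is the Keras default data_format.
import numpy as np
import keras

layer = keras.layers.Conv2D(filters=8, kernel_size=3, data_format="channels_last")
x = np.zeros((2, 32, 32, 3), dtype="float32")  # (batch_size, height, width, channels)
y = layer(x)
print(y.shape)  # (2, 30, 30, 8) with the default "valid" padding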
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/backend/jax/random.py
1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/backend/torch/random.py
1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/backend/numpy/random.py
1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/random/random_test.py
1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/backend/tensorflow/random.py
1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/random/random.py
1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/layers/preprocessing/hashing_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/backend/tensorflow/saved_model_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/losses/losses.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/layers/rnn/gru_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/trainers/data_adapters/data_adapter.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/backend/tensorflow/distribute_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/optimizers/adafactor.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/constraints/constraints.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./examples/keras_io/tensorflow/audio/uk_ireland_accent_recognition.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/layers/pooling/max_pooling1d.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/initializers/constant_initializers.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/layers/activations/activation_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/layers/convolutional/conv3d_transpose.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/ops/numpy.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/backend/jax/__init__.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/trainers/compile_utils.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/models/model_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/layers/preprocessing/random_zoom.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras namely TensorFlow, Jax, PyTorch and Numpy. - Add unit tests for each of these functions Importantly, As tensorflow doesn't offer a built-in method for beta function so I've implemented a workaround using a statistical formula to use gamma distributed random variables to derive beta distributed random variable. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$ where $U(a,b)$ is the beta distributed random variable using parameters $a$ and $b$ and $X(a)$ and $Y(b)$ are gamma-distributed random variables using parameter $a$ and $b$ respectively.
./keras/trainers/__init__.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/convolutional/conv1d_transpose.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/backend/numpy/__init__.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/regularization/activity_regularization.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/metrics/metrics_utils.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/rnn/bidirectional_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/merging/base_merge.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/backend/common/global_state.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./examples/keras_io/tensorflow/generative/deep_dream.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/constraints/constraints_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./examples/keras_io/vision/object_detection_using_vision_transformer.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/callbacks/callback.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/pooling/max_pooling2d.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/rnn/conv_lstm_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/utils/shape_utils.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/optimizers/nadam_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/backend/torch/trainer.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/saving/saving_lib.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./benchmarks/model_benchmark/image_classification_benchmark.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/legacy/saving/saving_utils.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./examples/demo_functional.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/merging/average.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/core/__init__.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/pooling/max_pooling3d.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/trainers/data_adapters/torch_data_loader_adapter_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./benchmarks/layer_benchmark/conv_benchmark.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./examples/keras_io/tensorflow/keras_recipes/tensorflow_numpy_models.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/legacy/saving/__init__.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/legacy/layers.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/backend/torch/optimizers/torch_parallel_optimizer.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/backend/numpy/trainer.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./examples/keras_io/structured_data/tabtransformer.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/applications/inception_v3.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/normalization/unit_normalization_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/backend/tensorflow/__init__.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./requirements.txt
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./examples/keras_io/tensorflow/keras_recipes/trainer_pattern.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./examples/keras_io/tensorflow/vision/swim_transformers.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./guides/transfer_learning.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/preprocessing/resizing.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/layers/preprocessing/hashed_crossing_test.py
-1
python
keras-team/keras
18,920
Implement `binomial` and `beta` distribution functions in `keras.random`
KhawajaAbaid
c5631ee48460b40f5280f7ff38aa5d3614784525
170a47ac8f98b79a5daede24932bb360dca2c981
2023-12-10 04:32:48+00:00
2023-12-11 20:41:33+00:00
Implement `binomial` and `beta` distribution functions in `keras.random`. Following up on the issue https://github.com/keras-team/keras/issues/18918 - Implement `binomial` and `beta` distribution functions in all backends currently supported by Keras, namely TensorFlow, JAX, PyTorch and NumPy. - Add unit tests for each of these functions. Importantly, as TensorFlow doesn't offer a built-in method for the beta distribution, I've implemented a workaround that uses a statistical formula to derive a beta-distributed random variable from gamma-distributed random variables. Specifically, $U(a, b) = X(a) / (X(a) + Y(b))$, where $U(a, b)$ is the beta-distributed random variable with parameters $a$ and $b$, and $X(a)$ and $Y(b)$ are gamma-distributed random variables with parameters $a$ and $b$, respectively.
./keras/backend/torch/optimizers/__init__.py
-1
python
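For context on how the two new functions are meant to be called, here is a hedged usage sketch; the parameter names `counts`, `probabilities`, `alpha`, and `beta`, and the use of `SeedGenerator`, are assumptions based on the PR description rather than the confirmed final API.

```python
from keras import random

seed = random.SeedGenerator(42)

# Binomial: number of successes in `counts` trials with success probability `probabilities`.
b = random.binomial(shape=(3,), counts=10, probabilities=0.3, seed=seed)

# Beta: samples in (0, 1) parameterized by alpha and beta.
u = random.beta(shape=(3,), alpha=2.0, beta=5.0, seed=seed)
```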
keras-team/keras
18,876
Ensure dtype consistency for activation functions
james77777778
ec49bc1be737cd4170093f44fb7b76251a3a35dd
92f4d17d9fa1553504dc132d3e8a2fb6d0077551
2023-12-03 13:03:58+00:00
2023-12-04 17:51:37+00:00
Ensure dtype consistency for activation functions. This PR includes the following: 1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`. 2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`. 3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`). 4. Add dtype tests for all existing activation functions.
./keras/activations/activations_test.py
1
python
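The rename described in this record keeps `hard_swish` available as an alias of `hard_silu`. As an assumption-level sketch of the relationship (not the Keras source), hard SiLU multiplies the input by the hard sigmoid:

```python
import numpy as np


def hard_sigmoid(x):
    # Piecewise-linear approximation of the sigmoid: clip(x / 6 + 0.5, 0, 1).
    return np.clip(x / 6.0 + 0.5, 0.0, 1.0)


def hard_silu(x):
    # hard_silu(x) = x * hard_sigmoid(x); `hard_swish` is just another name for it.
    return x * hard_sigmoid(x)


print(hard_silu(np.array([-4.0, 0.0, 4.0])))  # [0. 0. 4.]
```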
keras-team/keras
18,876
Ensure dtype consistency for activation functions
james77777778
ec49bc1be737cd4170093f44fb7b76251a3a35dd
92f4d17d9fa1553504dc132d3e8a2fb6d0077551
2023-12-03 13:03:58+00:00
2023-12-04 17:51:37+00:00
Ensure dtype consistency for activation functions. This PR includes the following: 1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`. 2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`. 3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`). 4. Add dtype tests for all existing activation functions.
./keras/ops/nn.py
1
python
keras-team/keras
18,876
Ensure dtype consistency for activation functions
james77777778
ec49bc1be737cd4170093f44fb7b76251a3a35dd
92f4d17d9fa1553504dc132d3e8a2fb6d0077551
2023-12-03 13:03:58+00:00
2023-12-04 17:51:37+00:00
Ensure dtype consistency for activation functions. This PR includes the following: 1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`. 2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`. 3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`). 4. Add dtype tests for all existing activation functions.
./keras/activations/activations.py
1
python
keras-team/keras
18,876
Ensure dtype consistency for activation functions
james77777778
ec49bc1be737cd4170093f44fb7b76251a3a35dd
92f4d17d9fa1553504dc132d3e8a2fb6d0077551
2023-12-03 13:03:58+00:00
2023-12-04 17:51:37+00:00
Ensure dtype consistency for activation functions. This PR includes the following: 1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`. 2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`. 3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`). 4. Add dtype tests for all existing activation functions.
./keras/activations/__init__.py
1
python