I want to implement a log-uniform distribution class whose functionality closely mimics tfp.distributions.Uniform (https://www.tensorflow.org/probability/api_docs/python/tfp/distributions/Uniform). With ChatGPT's help, I have the following example code:
import tensorflow_probability as tfp
import tensorflow as tf

tfd = tfp.distributions

# Define the log-uniform distribution
class LogUniform(tfd.TransformedDistribution):
    def __init__(self, low, high, validate_args=False, name='LogUniform'):
        base_distribution = tfd.Uniform(low=tf.math.log(low), high=tf.math.log(high))
        super(LogUniform, self).__init__(base_distribution, tfp.bijectors.Exp(),
                                         validate_args=validate_args, name=name)

# Usage example
low = 0.1
high = 1.0
loguniform = LogUniform(low, high)
samples = loguniform.sample(10000).numpy()
# print(tf.reduce_max(samples))

from matplotlib import pyplot as plt
import numpy as np

plt.figure()
plt.hist(np.log10(samples), bins='auto')
plt.show()
The above code seems to work, but it gives me the following warning when executed:
WARNING:root:
Distribution subclass LogUniform inherits `_parameter_properties from its parent (TransformedDistribution)
while also redefining `__init__`. The inherited annotations cover the following
parameters: dict_keys(['distribution', 'bijector']). It is likely that these do not match the subclass parameters.
This may lead to errors when computing batch shapes, slicing into batch
dimensions, calling `.copy()`, flattening the distribution as a CompositeTensor
(e.g., when it is passed or returned from a `tf.function`), and possibly other
cases. The recommended pattern for distribution subclasses is to define a new
`_parameter_properties` method with the subclass parameters, and to store the
corresponding parameter values as `self._parameters` in `__init__`, after
calling the superclass constructor:
  class MySubclass(tfd.SomeDistribution):

    def __init__(self, param_a, param_b):
      parameters = dict(locals())
      # ... do subclass initialization ...
      super(MySubclass, self).__init__(**base_class_params)
      # Ensure that the subclass (not base class) parameters are stored.
      self._parameters = parameters

    def _parameter_properties(self, dtype, num_classes=None):
      return dict(
        # Annotations may optionally specify properties, such as `event_ndims`,
        # `default_constraining_bijector_fn`, `specifies_shape`, etc.; see
        # the `ParameterProperties` documentation for details.
        param_a=tfp.util.ParameterProperties(),
        param_b=tfp.util.ParameterProperties())
I am new to TensorFlow and do not fully understand the issue yet. Does anyone have suggestions that could help me resolve this warning? An answer would help me fix the warning in my code, and it would also provide an open template for implementing a log-uniform distribution in TensorFlow for the whole community. Any advice would be greatly appreciated!