Cannot find reference 'math' in '__init__.py | __init__.py'

This problem occurs in PyCharm. I don't know how to solve it on this laptop.
I also tried Colab, and the code runs fine there, but I want to use this laptop. Please tell me how to solve this.
I'm sorry that my English is not good.
from tensorflow.math import exp, maximum

tensorflow-cpu 2.7.0 (pip)

Code
import tensorflow as tf
from tensorflow.math import exp, maximum
from tensorflow.keras.layers import Activation

# Random input tensor of shape (1, 5)
x = tf.random.normal(shape=(1, 5))

# Activation layers provided by Keras
sigmoid = Activation('sigmoid')
tanh = Activation('tanh')
relu = Activation('relu')

y_sigmoid_tf = sigmoid(x)
y_tanh_tf = tanh(x)
y_relu_tf = relu(x)

# The same activations computed manually
y_sigmoid_man = 1 / (1 + exp(-x))
y_tanh_man = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
y_relu_man = maximum(x, 0)

print("x: {}\n{}".format(x.shape, x.numpy()))
print("sigmoid(Tensorflow): {}\n{}".format(y_sigmoid_tf.shape, y_sigmoid_tf.numpy()))
print("sigmoid(manual): {}\n{}".format(y_sigmoid_man.shape, y_sigmoid_man.numpy()))
print("Tanh(Tensorflow): {}\n{}".format(y_tanh_tf.shape, y_tanh_tf.numpy()))
print("Tanh(manual): {}\n{}\n".format(y_tanh_man.shape, y_tanh_man.numpy()))
print("ReLU(Tensorflow): {}\n{}".format(y_relu_tf.shape, y_relu_tf.numpy()))
print("ReLU(manual): {}\n{}\n".format(y_relu_man.shape, y_relu_man.numpy()))
Result
x: (1, 5)
[[-1.1591758 2.2604184 -0.5664556 -2.350039 -0.1003536]]
sigmoid(Tensorflow): (1, 5)
[[0.2388171 0.9055454 0.3620551 0.08706263 0.47493264]]
sigmoid(manual): (1, 5)
[[0.2388171 0.9055454 0.3620551 0.08706267 0.47493264]]
Tanh(Tensorflow): (1, 5)
[[-0.82077116 0.9784743 -0.5127515 -0.98197484 -0.10001805]]
Tanh(manual): (1, 5)
[[-0.82077116 0.9784743 -0.5127515 -0.9819748 -0.10001804]]

ReLU(Tensorflow): (1, 5)
[[0. 2.2604184 0. 0. 0. ]]
ReLU(manual): (1, 5)
[[0. 2.2604184 0. 0. 0. ]]

The error comes from static code analysis; the code itself is evaluated with no runtime errors, so you can safely ignore the message. And yes, we use the same code-analysis engine as PyCharm, so the behaviour is similar here.
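If you just want to get rid of the red underline, here is a rough sketch of two common workarounds (not an official fix): suppress the unresolved-reference inspection for that one import with a noinspection comment, or skip the from-import and call the functions through the tf namespace, which static analysis usually resolves without complaint.

# Option 1: suppress PyCharm's unresolved-reference inspection for this import only
# noinspection PyUnresolvedReferences
from tensorflow.math import exp, maximum

# Option 2: access the functions via the tf namespace instead of a from-import
import tensorflow as tf

x = tf.random.normal(shape=(1, 5))
y_sigmoid_man = 1 / (1 + tf.math.exp(-x))   # same result as exp(-x) from the from-import
y_relu_man = tf.math.maximum(x, 0)

Both variants run exactly the same way at runtime; they only differ in how the IDE's static analysis sees the import.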

Thank you, that solved it for me. :slight_smile: