When inheriting two classes in Python, where and how should I write super?

Asked 2 years ago, Updated 2 years ago, 18 views

I want an initialize class that inherits from the MiddleLayer and OutputLayer classes, but I don't really understand how to do it.
My goal is to set the layer sizes (50, 3) and (3, 50) of MiddleLayer and OutputLayer in the initialize constructor, but I don't know where and how to write super(...) when inheriting from two classes.
(For what it's worth, the program does run.)

(Also, I'd like to display y of OutputLayer. When I run the last two lines, I get

AttributeError: 'function' object has no attribute 'y'

After an addition, the last two lines improved.)

import numpy as np
import matplotlib.pyplot as plt

class initialize(MiddleLayer, OutputLayer):

    def __init__(self, data_input, data_correct, number_continu, eta):
        self.input = data_input
        self.correct = data_correct
        self.m = MiddleLayer(50, 3)
        self.o = OutputLayer(3, 50, 0.1)
        self.number_continu = number_continu
        self.middle_affine = np.random.randn(1, 3)
        self.output = np.random.randn(50)
        self.eta = eta
    def learning(self):
        for i in range(self.number_continu):
            self.m.forward(self.input.reshape(1, 50))
            self.middle_affine = self.m.y
            self.o.forward(self.m.y)
            self.output = self.o.y.reshape(-1)
            self.o.backward(self.correct.reshape(1, 50))
            self.m.backward(self.o.grad_x)
            self.m.update(self.eta)
            self.o.update(self.eta)
    def plot_sin(self):
        fig = plt.figure()
        ax = fig.add_subplot(111)
        data = self.output
        ax.plot(data)

        plt.show()

class MiddleLayer:
    def __init__(self, n_upper=50, n=3, wb_width=0.1):
        self.w = wb_width * np.random.randn(n_upper, n)  # weight (matrix)
        self.b = wb_width * np.random.randn(n)  # bias (vector)
    def forward(self, x):
        self.x = x
        self.u = np.dot(x, self.w) + self.b
        self.y = np.maximum(0, self.u)  # ReLU
#        self.y = 1/(1+np.exp(-self.u))  # sigmoid function
    def backward(self, grad_y):
        delta = grad_y * np.where(self.y <= 0, 0, 1)
#        delta = grad_y*(1-self.y)*self.y
        self.grad_w = np.dot(self.x.T, delta)
        self.grad_b = np.sum(delta, axis=0)
        self.grad_x = np.dot(delta, self.w.T)
    def update(self, eta):
        self.w -= eta * self.grad_w
        self.b -= eta * self.grad_b
    
        
        
        
# -- Output layer --
class OutputLayer:
    def __init__(self, n_upper=3, n=50, wb_width=0.1):
        self.w = wb_width * np.random.randn(n_upper, n)  # weight (matrix)
        self.b = wb_width * np.random.randn(n)  # bias (vector)
    def forward(self, x):
        self.x = x
        u = np.dot(x, self.w) + self.b
        self.y = u  # identity function
    def backward(self, t):
        delta = self.y - t
        self.grad_w = np.dot(self.x.T, delta)
        self.grad_b = np.sum(delta, axis=0)
        self.grad_x = np.dot(delta, self.w.T)
    def update(self, eta):
        self.w -= eta * self.grad_w
        self.b -= eta * self.grad_b

test = initialize(data_input, correct, 10, 0.1)
test.learning()
test.plot_sin()

python

2022-09-29 21:33

1 Answers

For multiple inheritance of Python classes, I think the @IT article "[Python introduction] Multiple inheritance (1/2)" is easy to understand.

As for the question "I don't know where and how to write super(...) when inheriting from two classes": specify the parent class name explicitly rather than using super. With

super().__init__(...)

it is not obvious which class's __init__ will be called (at the very least, the code is hard to read), but with

MiddleLayer.__init__(...)

it is obvious that the MiddleLayer class's __init__ will be called.
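Here is a minimal sketch of that style (the class and attribute names are illustrative, not the questioner's actual layers): each parent's __init__ is invoked explicitly by class name, so it is clear which one runs.

```python
class MiddleLayer:
    def __init__(self, n_upper, n):
        self.middle_shape = (n_upper, n)

class OutputLayer:
    def __init__(self, n_upper, n):
        self.output_shape = (n_upper, n)

class Initialize(MiddleLayer, OutputLayer):
    def __init__(self):
        # Explicit class names make it unambiguous which __init__ is called,
        # and both parents get initialized with different arguments.
        MiddleLayer.__init__(self, 50, 3)
        OutputLayer.__init__(self, 3, 50)

net = Initialize()
print(net.middle_shape, net.output_shape)  # (50, 3) (3, 50)
```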

==
In multiple inheritance, when multiple parent classes define a method with the same name (as with __init__ above), which class takes precedence is determined by an algorithm called "C3 linearization".
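You can see the order C3 linearization produces by inspecting the class's __mro__ attribute. A small illustration (class names here are made up for the example):

```python
class A:
    def name(self):
        return "A"

class B:
    def name(self):
        return "B"

class C(A, B):
    pass

# A comes before B in "class C(A, B)", so A's method wins.
print(C().name())  # A
print([k.__name__ for k in C.__mro__])  # ['C', 'A', 'B', 'object']
```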


2022-09-29 21:33



© 2024 OneMinuteCode. All rights reserved.