What does "parameter" refer to in a neural network?

Asked 1 year ago, Updated 1 year ago, 117 views

This question is more about how neural networks work than about programming.

I don't understand what "parameters" means in the machine-learning rule of thumb "Get 10 times the number of parameters in your network" (the so-called rule of 10).

Does it mean the size of the input layer?
Does it mean the number of combinations in the intermediate (hidden) layers?
Or is it something else I don't know about?

I would appreciate an answer with a detailed explanation.

machine-learning deep-learning neural-network

2022-09-30 20:18

1 Answer

There is a page that explains the same claim, "Prepare 10 times the number of parameters in your network" (= the rule of 10), with source code, so I will answer based on it.

How much training data do you need?
Tensorflow-Projects/training_data_exploration.py

At that link, the author runs logistic regression with varying numbers of parameters, and the results suggest that you need about 10 times as many training examples as parameters to get an accurate model.

Here is the relevant part of the source code:

  # Construct a logistic regression model. Alternatively, one can use
  # tf.nn.softmax_cross_entropy_with_logits() for compactness. This version
  # makes the model explicit by exposing the basic units.
  W = tf.Variable(tf.random_uniform([1, num_parameters], -1.0, 1.0))
  y = tf.sigmoid(tf.matmul(W, x_data))

Have you seen W in a neural network before?
This W is the weight. So a "parameter" is a weight of the network (= W).
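To address the original question (input layer? combinations of hidden layers?): the parameter count includes every weight and bias in every layer, not just the input layer. Here is a minimal sketch, with made-up layer sizes, of how parameters are counted in a small fully connected network (this is an illustration, not the linked author's code):

```python
# Counting parameters in a small fully connected network.
# Layer sizes are hypothetical example values: input dim 4,
# two hidden layers (8 and 3 units), one output unit.
layer_sizes = [4, 8, 3, 1]

num_params = 0
for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
    num_params += n_in * n_out  # entries of the weight matrix
    num_params += n_out         # bias terms

print(num_params)  # (4*8+8) + (8*3+3) + (3*1+1) = 71
```

Under the rule of 10, this toy network would call for roughly 710 training examples.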

Machine learning usually works by minimizing the error of a hypothesis function; in the simplest case, linear regression, the hypothesis is

    h_θ(x) = θ₀ + θ₁x

These θ₀ and θ₁ are the parameters that training optimizes in the machine learning model.
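As an illustration of "training optimizes θ₀ and θ₁" (a sketch, not the linked author's code), here is a tiny gradient-descent loop that fits the hypothesis above to synthetic data generated from y = 2x + 1:

```python
import numpy as np

# Synthetic data: the true relationship is y = 2x + 1,
# so training should recover theta0 ≈ 1 and theta1 ≈ 2.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

theta0, theta1 = 0.0, 0.0
lr = 0.1
for _ in range(2000):
    err = theta0 + theta1 * x - y        # h(x) - y
    theta0 -= lr * err.mean()            # gradient of MSE/2 w.r.t. theta0
    theta1 -= lr * (err * x).mean()      # gradient of MSE/2 w.r.t. theta1

print(round(theta0, 3), round(theta1, 3))  # → 1.0 2.0
```

The learning rate and iteration count are arbitrary choices that happen to converge for this toy data.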

Use Case
Each weight in W corresponds to a feature of the machine learning model, that is, an element of the data you use to solve the problem. (If you try to predict house rent from various data, the features might be the number of rooms, bathrooms, and kitchens, and whether the house is next to a highway.)
The rule of 10 tells you how much training data you need as you add features.
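Applying the rule of 10 to the rent example above (the feature names here are hypothetical, chosen only for illustration):

```python
# One weight per feature, so number of parameters = number of features.
features = ["rooms", "bathrooms", "kitchens", "next_to_highway"]
num_parameters = len(features)

# The rule of 10: aim for ~10 training examples per parameter.
training_rows_needed = 10 * num_parameters
print(training_rows_needed)  # → 40
```

Adding a fifth feature would raise the suggested training-set size to 50, and so on.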

Note: What does theta mean in machine learning algorithms?

Edit
Apparently, this rule of thumb is known as Uncle Bernie's rule.

Note: What is Uncle Bernie's rule?




© 2024 OneMinuteCode. All rights reserved.