bree7246
  • 19-08-2021
  • Computers and Technology
answered

In the Gradient Descent algorithm, we are more likely to reach the global minimum, if the learning rate is selected to be a large value.

a. True
b. False

Answer:

oddkevin9
  • 19-08-2021
False
swan2414
  • 23-08-2021

Answer:

b. False

Explanation:

Gradient Descent is more likely to reach a local minimum. Starting from different points will, in general, lead to different local minima (the lowest point nearest the starting point), so there is no guarantee of finding the global minimum. If alpha (the learning rate) is too large, each update step can overshoot the minimum, so gradient descent may fail to converge and may even diverge.
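The divergence behavior described above can be demonstrated with a minimal sketch (not from the original answer): gradient descent on f(x) = x², whose gradient is 2x and whose global minimum is x = 0. The function `gradient_descent` and its parameter values are illustrative choices, not anything specified in the question.

```python
def gradient_descent(alpha, x0=10.0, steps=20):
    """Run `steps` updates x <- x - alpha * f'(x) for f(x) = x**2
    and return the final x."""
    x = x0
    for _ in range(steps):
        x = x - alpha * 2 * x  # derivative of x**2 is 2x
    return x

# With alpha = 0.1, each step multiplies x by (1 - 0.2) = 0.8,
# so x shrinks toward the minimum at 0.
small = gradient_descent(alpha=0.1)

# With alpha = 1.5, each step multiplies x by (1 - 3.0) = -2.0,
# so |x| doubles every step and the iterates diverge.
large = gradient_descent(alpha=1.5)

print(abs(small))  # small: has converged near 0
print(abs(large))  # huge: has diverged
```

This is why the answer is False: a large learning rate does not help reach the global minimum; past a threshold it makes the updates overshoot so badly that they move away from every minimum.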

