Coursera - Neural Networks for Machine Learning, Geoffrey Hinton, University of Toronto

Torrent size: 532.59 MB

Indexed: 2014-01-15

Magnet link:

File list: 49 files

  1. 5 - 4 - Convolutional nets for object recognition [17min].mp4 (23.03 MB)
  2. 7 - 1 - Modeling sequences A brief overview.mp4 (20.13 MB)
  3. 5 - 3 - Convolutional nets for digit recognition [16 min].mp4 (18.46 MB)
  4. 2 - 5 - What perceptrons cant do [15 min].mp4 (16.57 MB)
  5. 8 - 2 - Modeling character strings with multiplicative connections [14 mins].mp4 (16.56 MB)
  6. 8 - 1 - A brief overview of Hessian Free optimization.mp4 (16.24 MB)
  7. 10 - 1 - Why it helps to combine models [13 min].mp4 (15.12 MB)
  8. 6 - 5 - Rmsprop Divide the gradient by a running average of its recent magnitude.mp4 (15.12 MB)
  9. 1 - 1 - Why do we need machine learning [13 min].mp4 (15.05 MB)
  10. 10 - 2 - Mixtures of Experts [13 min].mp4 (14.98 MB)
  11. 6 - 2 - A bag of tricks for mini-batch gradient descent.mp4 (14.9 MB)
  12. 4 - 1 - Learning to predict the next word [13 min].mp4 (14.28 MB)
  13. 4 - 5 - Ways to deal with the large number of possible outputs [15 min].mp4 (14.26 MB)
  14. 8 - 3 - Learning to predict the next character using HF [12 mins].mp4 (13.92 MB)
  15. 9 - 1 - Overview of ways to improve generalization [12 min].mp4 (13.57 MB)
  16. 3 - 1 - Learning the weights of a linear neuron [12 min].mp4 (13.52 MB)
  17. 3 - 4 - The backpropagation algorithm [12 min].mp4 (13.35 MB)
  18. 9 - 5 - The Bayesian interpretation of weight decay [11 min].mp4 (12.27 MB)
  19. 9 - 4 - Introduction to the full Bayesian approach [12 min].mp4 (12 MB)
  20. 8 - 4 - Echo State Networks [9 min].mp4 (11.28 MB)
  21. 3 - 5 - Using the derivatives computed by backpropagation [10 min].mp4 (11.15 MB)
  22. 7 - 5 - Long-term Short-term-memory.mp4 (10.23 MB)
  23. 1 - 2 - What are neural networks [8 min].mp4 (9.76 MB)
  24. 6 - 3 - The momentum method.mp4 (9.74 MB)
  25. 10 - 5 - Dropout [9 min].mp4 (9.69 MB)
  26. 6 - 1 - Overview of mini-batch gradient descent.mp4 (9.6 MB)
  27. 2 - 2 - Perceptrons The first generation of neural networks [8 min].mp4 (9.39 MB)
  28. 1 - 3 - Some simple models of neurons [8 min].mp4 (9.26 MB)
  29. 1 - 5 - Three types of learning [8 min].mp4 (8.96 MB)
  30. 4 - 4 - Neuro-probabilistic language models [8 min].mp4 (8.93 MB)
  31. 7 - 4 - Why it is difficult to train an RNN.mp4 (8.89 MB)
  32. 2 - 1 - Types of neural network architectures [7 min].mp4 (8.78 MB)
  33. 9 - 3 - Using noise as a regularizer [7 min].mp4 (8.48 MB)
  34. 10 - 3 - The idea of full Bayesian learning [7 min].mp4 (8.39 MB)
  35. 10 - 4 - Making full Bayesian learning practical [7 min].mp4 (8.13 MB)
  36. 4 - 3 - Another diversion The softmax output function [7 min].mp4 (8.03 MB)
  37. 9 - 2 - Limiting the size of the weights [6 min].mp4 (7.36 MB)
  38. 7 - 2 - Training RNNs with back propagation.mp4 (7.33 MB)
  39. 2 - 3 - A geometrical view of perceptrons [6 min].mp4 (7.32 MB)
  40. 7 - 3 - A toy example of training an RNN.mp4 (7.24 MB)
  41. 5 - 2 - Achieving viewpoint invariance [6 min].mp4 (6.89 MB)
  42. 6 - 4 - Adaptive learning rates for each connection.mp4 (6.63 MB)
  43. 1 - 4 - A simple example of learning [6 min].mp4 (6.57 MB)
  44. 2 - 4 - Why the learning works [5 min].mp4 (5.9 MB)
  45. 3 - 2 - The error surface for a linear neuron [5 min].mp4 (5.89 MB)
  46. 5 - 1 - Why object recognition is difficult [5 min].mp4 (5.37 MB)
  47. 4 - 2 - A brief diversion into cognitive science [4 min].mp4 (5.31 MB)
  48. 9 - 6 - MacKays quick and dirty method of setting weight costs [4 min].mp4 (4.37 MB)
  49. 3 - 3 - Learning the weights of a logistic output neuron [4 min].mp4 (4.37 MB)