Detailed information

Allowed actions:

pdf/1728025.pdf
The 'Read' and 'Download' actions will become available if you sign in to the system or access the site from a computer on a different network.
epub/1728025.epub
The 'Download' action will become available if you sign in to the system or access the site from a computer on a different network.

Group: Anonymous users

Network: Internet

Usage rights for the stored object

Access location            User group                Actions
SPbPU ILC local network    All                       Read, Print, Download
Internet                   Authorized SPbPU users    Read, Print, Download
-> Internet                Anonymous users           (none)

Table of contents

  • Title Page
  • Copyright and Credits
  • Packt Upsell
  • Contributors
  • Table of Contents
  • Preface
  • Chapter 1: Getting Started with Machine Learning
    • What is AI?
    • The motivation behind ML
    • What is ML?
    • Applications of ML
      • Digital signal processing (DSP)
      • Computer vision
      • Natural language processing (NLP)
      • Other applications of ML
    • Using ML to build smarter iOS applications
    • Getting to know your data
      • Features
        • Types of features
        • Choosing a good set of features
      • Getting the dataset
      • Data preprocessing
    • Choosing a model
      • Types of ML algorithms
      • Supervised learning
      • Unsupervised learning
      • Reinforcement learning
      • Mathematical optimization – how learning works
      • Mobile versus server-side ML
      • Understanding mobile platform limitations
    • Summary
    • Bibliography
  • Chapter 2: Classification – Decision Tree Learning
    • Machine learning toolbox
    • Prototyping the first machine learning app
      • Tools
      • Setting up a machine learning environment
    • IPython notebook crash course
    • Time to practice
    • Machine learning for extra-terrestrial life explorers
    • Loading the dataset
    • Exploratory data analysis
    • Data preprocessing
      • Converting categorical variables
      • Separating features from labels
      • One-hot encoding
      • Splitting the data
    • Decision trees everywhere
    • Training the decision tree classifier
      • Tree visualization
      • Making predictions
      • Evaluating accuracy
      • Tuning hyperparameters
      • Understanding model capacity trade-offs
    • How decision tree learning works
      • Building a tree automatically from data
      • Combinatorial entropy
      • Evaluating performance of the model with data
        • Precision, recall, and F1-score
        • K-fold cross-validation
        • Confusion matrix
    • Implementing first machine learning app in Swift
    • Introducing Core ML
      • Core ML features
      • Exporting the model for iOS
      • Ensemble learning – random forest
      • Training the random forest
      • Random forest accuracy evaluation
      • Importing the Core ML model into an iOS project
      • Evaluating performance of the model on iOS
        • Calculating the confusion matrix
      • Decision tree learning pros and cons
    • Summary
  • Chapter 3: K-Nearest Neighbors Classifier
    • Calculating the distance
      • DTW
      • Implementing DTW in Swift
    • Using instance-based models for classification and clustering
    • People motion recognition using inertial sensors
    • Understanding the KNN algorithm
      • Implementing KNN in Swift
    • Recognizing human motion using KNN
      • Cold start problem
      • Balanced dataset
      • Choosing a good k
    • Reasoning in high-dimensional spaces
    • KNN pros
    • KNN cons
    • Improving our solution
      • Probabilistic interpretation
      • More data sources
      • Smarter time series chunking
      • Hardware acceleration
      • Trees to speed up the inference
      • Utilizing state transitions
    • Summary
    • Bibliography
  • Chapter 4: K-Means Clustering
    • Unsupervised learning
    • K-means clustering
    • Implementing k-means in Swift
      • Update step
      • Assignment step
    • Clustering objects on a map
    • Choosing the number of clusters
    • K-means clustering – problems
    • K-means++
    • Image segmentation using k-means
    • Summary
  • Chapter 5: Association Rule Learning
    • Seeing association rules
    • Defining data structures
    • Using association measures to assess rules
      • Supporting association measures
      • Confidence association measures
      • Lift association measures
      • Conviction association measures
    • Decomposing the problem
    • Generating all possible rules
    • Finding frequent item sets
    • The Apriori algorithm
    • Implementing Apriori in Swift
    • Running Apriori
    • Running Apriori on real-world data
    • The pros and cons of Apriori
    • Building an adaptable user experience
    • Summary
    • Bibliography
  • Chapter 6: Linear Regression and Gradient Descent
    • Understanding the regression task
    • Introducing simple linear regression
      • Fitting a regression line using the least squares method
        • Where to use GD and normal equation
        • Using gradient descent for function minimization
      • Forecasting the future with simple linear regression
    • Feature scaling
    • Feature standardization
      • Multiple linear regression
    • Implementing multiple linear regression in Swift
      • Gradient descent for multiple linear regression
        • Training multiple regression
        • Linear algebra operations
      • Feature-wise standardization
        • Normal equation for multiple linear regression
      • Understanding and overcoming the limitations of linear regression
    • Fixing linear regression problems with regularization
      • Ridge regression and Tikhonov regularization
        • LASSO regression
      • ElasticNet regression
    • Summary
    • Bibliography
  • Chapter 7: Linear Classifier and Logistic Regression
    • Revisiting the classification task
      • Linear classifier
      • Logistic regression
    • Implementing logistic regression in Swift
      • The prediction part of logistic regression
      • Training the logistic regression
      • Cost function
    • Predicting user intents
      • Handling dates
    • Choosing the regression model for your problem
    • Bias-variance trade-off
    • Summary
  • Chapter 8: Neural Networks
    • What are artificial NNs anyway?
    • Building the neuron
      • Non-linearity function
        • Step-like activation functions
        • Rectifier-like activation functions
    • Building the network
    • Building a neural layer in Swift
    • Using neurons to build logical functions
    • Implementing layers in Swift
    • Training the network
      • Vanishing gradient problem
      • Seeing biological analogies
    • Basic neural network subroutines (BNNS)
      • BNNS example
    • Summary
  • Chapter 9: Convolutional Neural Networks
    • Understanding users' emotions
    • Introducing computer vision problems
    • Introducing convolutional neural networks
    • Pooling operation
    • Convolution operation
      • Convolutions in CNNs
    • Building the network
      • Input layer
      • Convolutional layer
      • Fully-connected layers
      • Nonlinearity layers
      • Pooling layer
      • Regularization layers
        • Dropout
        • Batch normalization
    • Loss functions
    • Training the network
    • Training the CNN for facial expression recognition
    • Environment setup
    • Deep learning frameworks
      • Keras
    • Loading the data
    • Splitting the data
    • Data augmentation
    • Creating the network
    • Plotting the network structure
    • Training the network
    • Plotting loss
    • Making predictions
    • Saving the model in HDF5 format
    • Converting to Core ML format
    • Visualizing convolution filters
    • Deploying CNN to iOS
    • Summary
    • Bibliography
  • Chapter 10: Natural Language Processing
    • NLP in the mobile development world
    • Word Association game
    • Python NLP libraries
    • Textual corpuses
    • Common NLP approaches and subtasks
      • Tokenization
      • Stemming
      • Lemmatization
      • Part-of-speech (POS) tagging
      • Named entity recognition (NER)
      • Removing stop words and punctuation
    • Distributional semantics hypothesis
    • Word vector representations
    • Autoencoder neural networks
    • Word2Vec
    • Word2Vec in Gensim
    • Vector space properties
    • iOS application
      • Chatbot anatomy
      • Voice input
      • NSLinguisticTagger and friends
      • Word2Vec on iOS
      • Text-to-speech output
      • UIReferenceLibraryViewController
      • Putting it all together
    • Word2Vec friends and relatives
    • Where to go from here?
    • Summary
  • Chapter 11: Machine Learning Libraries
    • Machine learning and AI APIs
    • Libraries
    • General-purpose machine learning libraries
      • AIToolbox
      • BrainCore
      • Caffe
      • Caffe2
      • dlib
      • FANN
      • LearnKit
      • MLKit
      • Multilinear-math
      • MXNet
      • Shark
      • TensorFlow
      • tiny-dnn
      • Torch
      • YCML
    • Inference-only libraries
      • Keras
      • LibSVM
      • Scikit-learn
      • XGBoost
    • NLP libraries
      • Word2Vec
      • Twitter text
    • Speech recognition
      • TLSphinx
      • OpenEars
    • Computer vision
      • OpenCV
      • ccv
      • OpenFace
      • Tesseract
    • Low-level subroutine libraries
      • Eigen
      • fmincg-c
      • IntuneFeatures
      • SigmaSwiftStatistics
      • STEM
      • Swix
      • LibXtract
      • libLBFGS
      • NNPACK
      • Upsurge
      • YCMatrix
    • Choosing a deep learning framework
    • Summary
  • Chapter 12: Optimizing Neural Networks for Mobile Devices
    • Delivering perfect user experience
    • Calculating the size of a convolutional neural network
    • Lossless compression
    • Compact CNN architectures
      • SqueezeNet
      • MobileNets
      • ShuffleNet
      • CondenseNet
    • Preventing a neural network from growing big
    • Lossy compression
      • Optimizing for inference
        • Network pruning
        • Weights quantization
        • Reducing precision
        • Other approaches
          • Facebook's approach in Caffe2
      • Knowledge distillation
      • Tools
    • An example of the network compression
    • Summary
    • Bibliography
  • Chapter 13: Best Practices
    • Mobile machine learning project life cycle
      • Preparatory stage
        • Formulate the problem
        • Define the constraints
        • Research the existing approaches
        • Research the data
        • Make design choices
      • Prototype creation
        • Data preprocessing
        • Model training, evaluation, and selection
        • Field testing
      • Porting or deployment for a mobile platform
      • Production
    • Best practices
      • Benchmarking
      • Privacy and differential privacy
      • Debugging and visualization
      • Documentation
    • Machine learning gremlins
      • Data kobolds
        • Tough data
        • Biased data
        • Batch effects
      • Goblins of training
      • Product design ogres
        • Magical thinking
        • Cargo cult
        • Feedback loops
        • Uncanny valley effect
    • Recommended learning resources
      • Mathematical background
        • Machine learning
        • Computer vision
        • NLP
    • Summary
  • Index

Usage statistics

pdf/1728025.pdf

Number of accesses: 0
In the last 30 days: 0

epub/1728025.epub

Number of accesses: 0
In the last 30 days: 0