Classification of Smoking and Calling using Deep Learning
- URL: http://arxiv.org/abs/2012.08026v3
- Date: Sun, 12 Nov 2023 21:46:59 GMT
- Title: Classification of Smoking and Calling using Deep Learning
- Authors: Miaowei Wang, Alexander William Mohacey, Hongyu Wang, James Apfel
- Abstract summary: A pipeline is introduced to perform the classification of smoking and calling by modifying a pretrained Inception V3.
Brightness enhancement based on deep learning is implemented to improve performance on this classification task, along with other useful training tricks.
- Score: 49.10965021800014
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Since 2014, very deep convolutional neural networks have been proposed and
have become the must-have weapon for champions in all kinds of competitions. In this
report, a pipeline is introduced to perform the classification of smoking and
calling by modifying a pretrained Inception V3. Brightness enhancement based on
deep learning is implemented to improve performance on this classification
task, along with other useful training tricks. Based on the qualitative and
quantitative results, it can be concluded that this pipeline is practical and
achieves high accuracy even with small, biased samples.
Related papers
- Hidden Classification Layers: Enhancing linear separability between
classes in neural networks layers [0.0]
We investigate the impact of a training approach on deep network performance.
We propose a neural network architecture that induces an error function involving the outputs of all the network layers.
arXiv Detail & Related papers (2023-06-09T10:52:49Z)
- The Cascaded Forward Algorithm for Neural Network Training [61.06444586991505]
We propose a new learning framework for neural networks, the Cascaded Forward (CaFo) algorithm, which, like the Forward-Forward (FF) algorithm, does not rely on backpropagation (BP).
Unlike FF, our framework directly outputs label distributions at each cascaded block and does not require the generation of additional negative samples.
In our framework each block can be trained independently, so it can be easily deployed into parallel acceleration systems.
arXiv Detail & Related papers (2023-03-17T02:01:11Z)
- Leveraging Angular Information Between Feature and Classifier for Long-tailed Learning: A Prediction Reformulation Approach [90.77858044524544]
We reformulate the recognition probabilities through included angles without re-balancing the classifier weights.
Inspired by the performance improvement of the predictive form reformulation, we explore the different properties of this angular prediction.
Our method is able to obtain the best performance among peer methods without pretraining on CIFAR10/100-LT and ImageNet-LT.
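The angular reformulation summarized above can be illustrated with a minimal cosine classifier, a common form of angular prediction in which logits come from the angle between features and class weights rather than raw dot products. The scale factor and dimensions here are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def cosine_logits(features, weights, scale=16.0):
    """Angular prediction (sketch): logits are the scaled cosine of the
    angle between each feature vector and each class weight vector."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    return scale * f @ w.T  # each entry is scale * cos(theta), in [-scale, scale]

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))    # 4 samples, 8-dim features
protos = rng.normal(size=(3, 8))   # 3 class weight vectors
logits = cosine_logits(feats, protos)
print(logits.shape)  # (4, 3)
```

Because the norms of the classifier weights cancel out, head classes cannot dominate tail classes through weight magnitude alone, which is the intuition behind angular predictions in long-tailed learning.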
arXiv Detail & Related papers (2022-12-03T07:52:48Z)
- BatchFormer: Learning to Explore Sample Relationships for Robust Representation Learning [93.38239238988719]
We propose to enable deep neural networks with the ability to learn the sample relationships from each mini-batch.
BatchFormer is applied to the batch dimension of each mini-batch to implicitly explore sample relationships during training.
We perform extensive experiments on over ten datasets and the proposed method achieves significant improvements on different data scarcity applications.
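The batch-dimension idea above can be sketched as follows: treat the mini-batch as a sequence and let a transformer encoder attend across it, so each sample's feature can borrow information from the others. Layer sizes are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

# BatchFormer-style sketch: a transformer encoder applied along the batch
# dimension so samples in a mini-batch can attend to one another.
dim = 64
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=False),
    num_layers=1,
)

feats = torch.randn(16, dim)   # a mini-batch of 16 feature vectors
# With batch_first=False the input is (seq_len, batch, dim); putting the
# mini-batch on the sequence axis makes attention run across samples.
seq = feats.unsqueeze(1)       # (16, 1, dim)
out = encoder(seq).squeeze(1)  # (16, dim): batch-aware features
print(out.shape)  # torch.Size([16, 64])
```

At test time, when no mini-batch is available, such a module is typically bypassed, which is why it is described as an implicit, training-time exploration of sample relationships.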
arXiv Detail & Related papers (2022-03-03T05:31:33Z)
- The Devil is the Classifier: Investigating Long Tail Relation Classification with Decoupling Analysis [36.298869931803836]
Long-tailed relation classification is a challenging problem as the head classes may dominate the training phase.
We propose a robust classifier with attentive relation routing, which assigns soft weights by automatically aggregating the relations.
arXiv Detail & Related papers (2020-09-15T12:47:00Z)
- Active Deep Densely Connected Convolutional Network for Hyperspectral Image Classification [6.850575514129793]
It is still very challenging to use only a few labeled samples to train deep learning models to reach a high classification accuracy.
An active deep-learning framework trained in an end-to-end manner is therefore proposed in this paper to minimize hyperspectral image classification costs.
arXiv Detail & Related papers (2020-09-01T09:53:38Z)
- Solving Long-tailed Recognition with Deep Realistic Taxonomic Classifier [68.38233199030908]
Long-tail recognition tackles the naturally non-uniformly distributed data of real-world scenarios.
While modern classifiers perform well on populated classes, their performance degrades significantly on tail classes.
Deep-RTC is proposed as a new solution to the long-tail problem, combining realism with hierarchical predictions.
arXiv Detail & Related papers (2020-07-20T05:57:42Z)
- End-to-End Auditory Object Recognition via Inception Nucleus [7.22898229765707]
We propose a novel end-to-end deep neural network to map the raw waveform inputs to sound class labels.
Our network includes an "inception nucleus" that optimizes the size of convolutional filters on the fly.
arXiv Detail & Related papers (2020-05-25T16:08:41Z)
- Ensemble Wrapper Subsampling for Deep Modulation Classification [70.91089216571035]
Subsampling of received wireless signals is important for relaxing hardware requirements as well as the computational cost of signal processing algorithms.
We propose a subsampling technique to facilitate the use of deep learning for automatic modulation classification in wireless communication systems.
arXiv Detail & Related papers (2020-05-10T06:11:13Z)
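The subsampling idea in the last entry can be illustrated with the most basic case: keeping a uniformly spaced subset of a received signal before feeding it to a classifier. The paper's ensemble wrapper scheme is more involved; this sketch only shows why subsampling relaxes the sampling-rate and compute requirements, with all dimensions assumed.

```python
import numpy as np

# Sketch: uniform subsampling of a received wireless signal before
# classification. Signal length and subsampling factor are illustrative.
rng = np.random.default_rng(1)
signal = rng.normal(size=(1024, 2))  # 1024 complex samples as (I, Q) pairs

factor = 4                           # keep every 4th sample
subsampled = signal[::factor]        # 4x fewer samples for the classifier
print(subsampled.shape)  # (256, 2)
```

A classifier operating on the subsampled signal processes a quarter of the data, at the cost of whatever information the discarded samples carried; the wrapper approach in the paper selects which samples to keep in a learned, classifier-aware way.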
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.