The Hackathon starts in less than a week, so we can announce this:
The challenges of the #1 skyhacks edition will revolve around:
1. Computer Vision
We will share the more formal rules later, so stay tuned! Details will be publicly announced during the opening ceremony on Friday, 16th November 2018 in Gliwice.
But don’t wait: register today. Tons of cool challenges & prizes will be there!
Be prepared to receive a huge set of data (1 GB+) and to process a portion of those files in order to first train a machine learning model, then run it on the remaining portion of the data. The data can come in the form of text, CSV files, images, audio files, videos, and more.
Consider learning, or brushing up on, TensorFlow, Keras, Torch, R, computer vision, CNNs, and more.
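The train-on-one-portion, predict-on-the-rest workflow described above can be sketched in a few lines of scikit-learn. This is a hypothetical illustration on synthetic data, not the actual challenge pipeline:

```python
# Sketch of the expected workflow: fit a model on one portion of the
# data, then run it on the held-out remainder. Synthetic data only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

model = LogisticRegression()
model.fit(X_train, y_train)          # teach the model on one portion
predictions = model.predict(X_test)  # execute on the remaining portion
print(len(predictions))              # one prediction per held-out sample
```

In the real challenge, the split between "training" and "remaining" data will presumably be dictated by the dataset you receive rather than chosen by `train_test_split`.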
Scikit-learn was made to provide an easy-to-use interface for developers to use off-the-shelf general-purpose machine learning algorithms for both supervised and unsupervised learning.
Scikit-learn provides functions that apply classic machine learning algorithms, such as support vector machines, logistic regression, and k-nearest neighbours, very easily. The one type of machine learning model it doesn’t let you implement, however, is the neural network: it provides no GPU support, which is fundamental for training the most recent deep learning models.
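The off-the-shelf interface mentioned above boils down to the same fit/predict pattern across all of scikit-learn’s classic algorithms. A minimal sketch with k-nearest neighbours on a tiny hand-made dataset:

```python
# Minimal scikit-learn fit/predict sketch with k-nearest neighbours.
from sklearn.neighbors import KNeighborsClassifier

X = [[0, 0], [0, 1], [1, 0], [1, 1]]  # four 2-D points
y = [0, 0, 1, 1]                      # label = first coordinate

knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X, y)                      # supervised learning on labelled data
print(knn.predict([[0.9, 0.9]]))   # nearest point is [1, 1], so class 1
```

Swapping `KNeighborsClassifier` for `SVC` or `LogisticRegression` requires changing only the constructor line; the `fit`/`predict` calls stay identical.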
Keras is a high-level neural networks API, written in Python and capable of running on top of TensorFlow, CNTK, or Theano. It was developed with a focus on enabling fast experimentation. Being able to go from idea to result with the least possible delay is key to doing good research.
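The fast experimentation Keras is built for looks roughly like the following sketch, which defines, compiles, and runs a tiny network. The layer sizes are arbitrary, and it assumes the Keras bundled with TensorFlow:

```python
# Minimal Keras sketch: a tiny network defined in a few lines.
# Layer sizes here are illustrative, not tuned for anything.
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

x = np.random.rand(10, 4).astype("float32")
out = model.predict(x)    # one sigmoid output per input row
print(out.shape)          # (10, 1)
```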
Tensorflow is an open source software library for high performance numerical computation. Its flexible architecture allows easy deployment of computation across a variety of platforms (CPUs, GPUs, TPUs), and from desktops to clusters of servers to mobile and edge devices. Originally developed by researchers and engineers from the Google Brain team within Google’s AI organization, it comes with strong support for machine learning and deep learning and the flexible numerical computation core is used across many other scientific domains.
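The numerical computation core mentioned above is visible even in a trivial example: the same operation runs unchanged on whichever device (CPU, GPU, TPU) is available. A small sketch:

```python
# TensorFlow's numerical core: the same matrix multiply is dispatched
# to whatever device is available, with no change to the code.
import tensorflow as tf

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 0.0], [0.0, 1.0]])  # identity matrix

c = tf.matmul(a, b)
print(c.numpy().tolist())  # [[1.0, 2.0], [3.0, 4.0]]
```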
NLTK is a leading platform for building Python programs to work with human language data. It provides easy-to-use interfaces to over 50 corpora and lexical resources such as WordNet, along with a suite of text processing libraries for classification, tokenization, stemming, tagging, parsing, and semantic reasoning, as well as wrappers for industrial-strength NLP libraries.
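Two of the text processing steps listed above, tokenization and stemming, can be sketched as follows. This uses NLTK’s regexp tokenizer and Porter stemmer, which need no extra corpus downloads:

```python
# NLTK sketch: split a sentence into word tokens, then reduce each
# token to its stem with the Porter stemmer.
from nltk.tokenize import RegexpTokenizer
from nltk.stem import PorterStemmer

tokenizer = RegexpTokenizer(r"\w+")   # keep word characters, drop punctuation
tokens = tokenizer.tokenize("The planes were flying quickly.")

stemmer = PorterStemmer()
stems = [stemmer.stem(t) for t in tokens]  # e.g. "planes" -> "plane"
print(stems)
```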
Natural Language Processing with Python provides a practical introduction to programming for language processing. Written by the creators of NLTK, it guides the reader through the fundamentals of writing Python programs, working with corpora, categorizing text, analyzing linguistic structure, and more. The online version of the book has been updated for Python 3 and NLTK 3.3.
Gensim is a free Python library designed to automatically extract semantic topics from documents, as efficiently (computer-wise) and painlessly (human-wise) as possible. It does so by examining statistical co-occurrence patterns within a corpus of training documents.
Once these statistical patterns are found, any plain-text document (sentence, phrase, word…) can be succinctly expressed in the new semantic representation and queried for topical similarity against other documents (words, phrases…).
With over 6 million users, the open source Anaconda Distribution is the fastest and easiest way to do Python and R data science and machine learning on Linux, Windows, and Mac OS X. It’s the industry standard for developing, testing, and training on a single machine.
Anaconda Enterprise is an AI/ML enablement platform that empowers organizations to develop, govern, and automate AI/ML and data science from laptop through training to production. It lets organizations scale from individual data scientists to collaborative teams of thousands, and to go from a single server to thousands of nodes for model training and deployment.
Use TensorFlow.js model converters to run pre-existing TensorFlow models right in the browser or under Node.js.