Question
You have been asked by your IT manager to write a technical report on Artificial Intelligence with recommendations about how the team can use the technology (about 1500 words excluding bibliography). You have been pointedly reminded that some examples of working code are required and some useful instructions for setting up a test environment (discuss CPU versus GPU environments). Please note that a lot of research environments use LINUX - this means UBUNTU for you. The most used languages for AI are Python and R. Please keep to Python. While stuff does run on Windows 10, it is a bit clunky (in my view).
In your report you must show correct in-text citations using the accepted NSI standard (see learning pack). The submission will be checked for similarity and you must be below 20% for the similarity check. You must provide a supported definition of Artificial Intelligence with an explanation of machine learning as opposed to Deep Learning. You must cover supervised versus unsupervised learning and how to evaluate and improve AI models. You should explain the various types of approaches (algorithms) for various types of problems (e.g. neural networks, decision trees and so on). Give working examples of code. Talk about managing text data and discuss how to use algorithm chaining and pipelines. How do people test their algorithms (look at the Kaggle web site)? The assignment is really quite open-ended. There are no formal reviews. It is in your interests (assuming you want to do this assignment) to show and discuss it with me before submitting it.
Turnitin: This report will be checked for similarities with online publications. This assignment will help you get attributions correct, using quotes and so on. Turnitin will check for similarities between your work and that of other students, between your work and publications, and against all other work submitted internationally to Turnitin.
Normally, at a post-graduate level, the similarity count must be below 1 or 2%. Seeing as you are undergraduates, the similarity count should be below 20%. I will check all the similarity reports. Since this is a learning process for you, I will tell you what you need to do to fix the report. If you have a 20% score at the close-off date, you will not get any bonus points for this assignment.
URLs:
See week 3 for a URL to a tutorial on using TensorFlow with Python. TensorFlow is at https://www.tensorflow.org/
Keras is one of the most popular libraries for Deep Learning (using Python). Here is a tutorial https://www.datacamp.com/community/tutorials/deep-learning-python
SCIKIT-LEARN home and general introduction to Machine Learning. http://scikit-learn.org/stable/index.html
SCIKIT-LEARN is a very good collection of tools, algorithms and tutorials on Machine Learning in general. This is a good URL for various tutorials and other information http://scikit-learn.org/stable/tutorial/index.html.
You will also want to look at the concept of gradient boosted trees (Muller & Guido, pages 220-24): https://www.datacamp.com/community/tutorials/xgboost-in-python
The learning pack of the course has a video tutorial on Machine Learning (see the learning pack URL at the course introduction).
Books:
There are many books on Artificial Intelligence, most are concerned with the mathematics of the AI algorithms. I want you to focus your concern on how we can use these algorithms to solve real-world problems. Here are two books I think are good reads:
Introduction to Machine Learning with Python by Andreas C. Muller and Sarah Guido, published by O'Reilly. (They use SCIKIT-LEARN.)
Deep Learning with Python by Francois Chollet, published by Manning (neural networks for the main part, and TensorFlow).
Kaggle is the website for open competitions in AI. Please have a browse and this is "where it's at". https://www.kaggle.com/competitions
Staying up-to-date is a challenge, particularly in AI. These days it can take from several months to 2-3 years to get a research article published. Archives such as https://arxiv.org/help/general are used by many authors to get research and the latest advances into circulation. There is a wealth of material here, so it's worthwhile to have a look.
This is an eBOOK courtesy of the library. You will have to login with your student ID. It is an Introduction to Deep Learning and has some more mathematics in it, so treat it as an advanced book. I suggest you download the PDF format to make it more portable. Please note that this is a one-person-one-copy download. http://www.nsi.tafensw.edu.au/libraries/seb/Springer%20Redirect%20Pages/2018_Introduction_to_Deep_Learning42.html
This is an eBOOK from the library: "Practical Machine Learning with Python: A Problem-Solver's Guide to Building Real-World Intelligent Systems" http://www.nsi.tafensw.edu.au/libraries/seb/Springer%20Redirect%20Pages/2018_Practical_Machine_Learning_with_Python71.html
Some Notes.
There has been much hype about AI since the 1950s, with predictions about how artificial intelligence will soon be equivalent to human intelligence. These predictions have all been wrong. We do not have a clear, science-based understanding of intelligence or, for that matter, of what learning is. So you need to be very skeptical, especially in this assignment - I want your thoughts and reasoning based on researched science articles, not silly misconceptions from the popular press.
"Deep Learning" (you will need a cited definition of this term) has, according to Chollet's book on page 11 (see above) achieved much in the last few years including:
Near-human level image classification.
Near-human level speech recognition.
Near-human level handwriting transcription.
Improved machine translation.
Improved text-to-speech conversion.
Near-human-level autonomous driving.
Improved ad targeting as used by Google and Bing.
Improved search results on the web.
Ability to answer natural-language questions.
Superhuman Go playing.
Explanation / Answer
You can edit the following information and use it according to your requirements.
Technical Report on Artificial Intelligence :::
Part-01 [ What is AI ]
Artificial Intelligence is a research area in which machines are designed to learn for themselves without being explicitly programmed.
This means a machine can act on its own, following instructions learned with the help of different ML and DL algorithms.
This is a simple meaning of Artificial Intelligence.
The main aim of AI is to create intelligent machines that can act like humans.
Machine Learning and Deep Learning, in turn, are the techniques used to bring AI to machines.
ML and Deep Learning are integral parts of Artificial Intelligence.
Without ML and Deep Learning, no machine can be intelligent.
This is a brief introduction to AI.
Machine Learning :::
This is a sub-field of AI which focuses mainly on large amounts of data; it uses different algorithms to process the data and arrive at an appropriate solution with good accuracy. Different algorithms obtain solutions with different accuracies.
The main agenda behind ML is to make the machine learn with the help of the large amount of data provided: the machine is trained to acquire knowledge from the data, and with this knowledge it can predict future events. This is how machine learning is done.
Some of the most commonly used ML algorithms are Linear Regression, Logistic Regression, Random Forest, KNN, XGBoost, etc.
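As a quick illustration of how such an algorithm is used in practice, here is a minimal linear regression sketch with scikit-learn (the sizes and prices are invented purely for illustration):

```python
from sklearn.linear_model import LinearRegression

# toy data: house sizes in square metres and their prices (made up)
X = [[50], [80], [110], [140]]        # one feature per sample
y = [150000, 240000, 330000, 420000]

model = LinearRegression()
model.fit(X, y)                       # learn the size -> price relationship
print(model.predict([[100]]))         # estimate for a 100 m^2 house, roughly 300000
```

The same fit/predict pattern applies to almost every scikit-learn algorithm, which is what makes swapping algorithms so easy.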
Deep Learning :::
Deep Learning is a sub-class of Machine Learning. It applies the same learning idea but in a different manner, in the form of stacked layers. For example, the Convolutional Neural Network (CNN) and the Long Short-Term Memory (LSTM) network are deep learning architectures. You train them using data, just like other machine learning algorithms.
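Real deep learning is usually done with Keras or TensorFlow (see the URLs above), but the layered idea can be sketched even with scikit-learn's small MLPClassifier. The XOR data below is a classic toy problem that no single linear model can solve, while a network with hidden layers can:

```python
from sklearn.neural_network import MLPClassifier

# XOR: the classic toy problem that is not linearly separable
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# two hidden layers of 8 units each - a (very) small "deep" network
net = MLPClassifier(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=1)
net.fit(X, y)
print(net.predict(X))   # one 0/1 prediction per input pair
```

The hidden_layer_sizes tuple is the only thing that makes this "deep"; everything else is the usual fit/predict workflow.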
Supervised Learning Algorithms ::
In supervised learning, we are given a data set and already know what our correct output should look like, having the idea that there is a relationship between the input and the output.
Some of the most common examples are predicting house prices, building recommendation systems, etc.
Unsupervised Learning Algorithms ::
Unsupervised learning, allows us to approach problems with little or no idea what our results should look like. We can derive structure from data where we don't necessarily know the effect of the variables.
Clustering is the most commonly used method in this approach.
One example is grouping unlabelled images of handwritten digits.
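A minimal clustering sketch with scikit-learn's KMeans; the points are made up so that the two groups are obvious, and note that no labels are supplied:

```python
from sklearn.cluster import KMeans

# two well-separated groups of 2-D points, invented for illustration
X = [[1, 1], [1, 2], [2, 1],
     [8, 8], [8, 9], [9, 8]]

km = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = km.fit_predict(X)   # each point is assigned a cluster id (0 or 1)
print(labels)
```

The algorithm discovers the two groups on its own; which group gets id 0 and which gets id 1 is arbitrary.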
Part - 02 [ How to set up an AI environment ]
[ Hardware ]
An environment is an area where one can run code. Different processes require different environments. For example, if you want to run a C program you need to have a compiler, which allows the program to execute. In a similar manner, if you want to develop an application with AI techniques you need to build an environment. Generally it takes a lot of time and money to build an AI environment, as it involves high-end processors, large amounts of RAM, GPUs and more.
Here are the basic requirements to set up a machine for developing AI applications. [For personal use only]
RAM :: You need a large amount of RAM to process large amounts of data [a minimum of 8GB to 16GB].
Processor :: You need a fast processor [at minimum an Intel i7].
SSD and HDD :: Around 500 GB of SSD or HDD, depending on your requirements.
GPU :: A GPU is highly recommended to process your training data faster [one or more dedicated GPUs for faster processing].
These are the main requirements; apart from this we need some other basic parts.
Instead of doing all this, if you only require these services for a limited amount of time, you can use cloud service providers, which offer GPUs at some cost.
If you are at the learning stage you may use Google Colab, which provides a free GPU service (mostly preferred by students).
[ Software ]
It is better to use UBUNTU compared to other operating systems because it provides easier ways to install all the dependencies required to deploy AI software. Most developers suggest UBUNTU rather than other OSes because of its simplicity and the ease of setting up the required dependencies.
Coming to programming languages, Python and R are the most used languages for developing AI applications, and Python is the better of the two. Python provides many libraries that make ML and DL work simpler. For example, you can use the scikit-learn library, which provides ready-made machine learning algorithms; pandas, to easily analyse data by storing it in dataframes; and matplotlib and seaborn, to plot data and present it graphically, among many others. So Python is the most preferred language for deploying AI applications.
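As a sketch, the basic stack can be installed on Ubuntu roughly as follows. The package names are the usual ones but may differ on your release, and the GPU build of TensorFlow additionally needs matching NVIDIA CUDA/cuDNN drivers (see tensorflow.org for the supported versions):

```shell
sudo apt update
sudo apt install -y python3 python3-pip python3-venv

# keep project dependencies isolated in a virtual environment
python3 -m venv ai-env
source ai-env/bin/activate

# core CPU-only ML stack
pip install numpy pandas matplotlib seaborn scikit-learn tensorflow
```

On a CPU-only machine everything above still works; training is simply slower, which is usually fine for learning and small datasets.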
Different ML Libraries ::
TENSORFLOW ::
Many advances came in the field of AI after Google released this open-source library. It took object detection to the next stage: many object detection models have been built and open-sourced with the help of this library, which makes developers' work much easier.
KERAS ::
This is a Deep Learning library which helps beginners develop deep learning applications easily. Many deep learning functionalities can be implemented simply with the help of Keras. Similarly, there are other libraries, such as Caffe, which also support deep learning.
SCIKIT - LEARN ::
scikit-learn is used to implement ML algorithms easily via its built-in functions. The library lets the developer use the algorithms' functionality to the maximum extent, as they are simple to apply. A sample working example that uses scikit-learn is given below.
Evaluation and Improvements ::
With so many algorithms and libraries available, what matters for any application in the end is its ACCURACY, which shows how well the algorithm is performing. Initially an algorithm is trained with one set of data and then tested with new data to check its accuracy; this is called evaluation. If the accuracy is low, improvements can be made to the application. In this manner applications are first tested and then deployed.
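The train-then-test cycle described above can be sketched with scikit-learn's bundled digits dataset: hold part of the data back, train on the rest, and score the model on the unseen part. The dataset and model choices here are just for illustration:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)   # 25% held back for testing

model = DecisionTreeClassifier(random_state=0)
model.fit(X_train, y_train)                 # train only on the training split

predictions = model.predict(X_test)
print(accuracy_score(y_test, predictions))  # fraction of test digits classified correctly
```

Scoring on data the model never saw is what distinguishes genuine evaluation from simply memorising the training set.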
A sample working code ::: Predicting House Prices
Here are the basic steps you need to follow to predict house prices for a given dataset:
1. Take a dataset or download any dataset available online.
2. Store the dataset by using pandas.
If you want to assign data from a ''csv file'' to a variable, use the following
eg::code::
import pandas as pd
data = pd.read_csv('/home/Desktop/kaggle/housing_data.csv')  # give the exact path of the csv file
If you want to read data from an ''Excel file'' into a variable, use the following
eg::code::
import pandas as pd
data = pd.read_excel('/home/Desktop/kaggle/housing_data.xlsx')  # give the exact path of the Excel file
Similarly:
pd.read_json('/path')  # for reading JSON files
pd.read_hdf('/path/file.h5')  # for reading HDF files
3. Once your data is stored you need to know the structure of the data. Here is simple code to find it:
structure_of_data = data.describe()  # gives summary statistics of the data
4. If you want to extract a particular column of the dataframe for analysis, use the following:
data.columns  # gives the names of the columns (an attribute, not a method)
data.column_name  # or data['column_name'], to access a particular column
To get access to multiple columns at the same time, we use the following:
selected_cols = ['no_of_bedrooms', 'size_of_house', 'age_of_house']  # selecting the required cols
data_selected = data[selected_cols]  # data restricted to those columns
data_selected.describe()  # structure of the selected data
Building a prediction model with the help of scikit-learn --
1. What are we trying to predict from the given dataset?  # In our case, HousePrice from the housing dataset
2. Assign the output -- the prediction target -- to y:
y = data.HousePrice  # takes the data in the HousePrice column into y
3. Now choose the input values -- the predictors -- in order to predict the prediction target HousePrice:
predictors = ['size', 'no_of_bedrooms', 'age_of_house']
X = data[predictors]
4. Now we have partitioned our data into output (y) and input (X).
Here we are going to use sklearn library to analyze the data and apply some machine learning algorithms to predict house prices
5. We use some terms in this process, namely...
Define: What type of model will it be? A decision tree? Some other type of model? Other parameters of the model type are specified here too.
Fit: Capture patterns from the provided data. This is the heart of modeling.
Predict: Generate predictions for new inputs from the fitted model.
Evaluate: Determine how accurate the model's predictions are.
6. Sample code to implement our model in scikit-learn:
from sklearn.tree import DecisionTreeRegressor  # importing the DecisionTreeRegressor algorithm
house_model = DecisionTreeRegressor()
house_model.fit(X, y)
7. Here we start our work --> predicting the house prices of the first 5 houses:
print("We are predicting the house prices of the first 5 houses")
print(X.head())
print("The predictions are :: ")
print(house_model.predict(X.head()))
# (sample output omitted)
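The brief also asks about algorithm chaining and pipelines. A scikit-learn Pipeline chains preprocessing steps and a model into one object, so fit() and predict() run every step in order. The tiny dataframe below is invented to stand in for the hypothetical housing_data.csv:

```python
import pandas as pd
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeRegressor

# made-up stand-in for the housing dataset used above
data = pd.DataFrame({
    'size':           [50, 80, 110, 140, 95],
    'no_of_bedrooms': [1, 2, 3, 4, 3],
    'age_of_house':   [30, 12, 5, 2, 20],
    'HousePrice':     [150000, 240000, 330000, 420000, 280000],
})
X = data[['size', 'no_of_bedrooms', 'age_of_house']]
y = data.HousePrice

pipe = Pipeline([
    ('scale', StandardScaler()),        # step 1: normalise the features
    ('tree', DecisionTreeRegressor()),  # step 2: fit the model
])
pipe.fit(X, y)                          # runs both steps in order
print(pipe.predict(X.head()))
```

Chaining steps this way also prevents a common mistake: because the scaler is fitted inside the pipeline, it can never accidentally be fitted on test data.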
If you need any further information, please comment.
Thanks