
Added my small project -- Social_Network_Ads #61


Open: wants to merge 28 commits into base: master

Commits (28):
- 6951590 Create Full_Connection.md (uc-creat, Oct 14, 2020)
- e73c3fe Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 703cfd4 Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 3c474fc Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 75cd12d Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 988c0fb Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 70d2e69 Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 86f11b3 Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 111d1be Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 55c16c6 Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 56cc5e2 Update Full_Connection.md (uc-creat, Oct 14, 2020)
- a091d82 Update Full_Connection.md (uc-creat, Oct 14, 2020)
- 5a25ec4 Update Full_Connection.md (uc-creat, Oct 14, 2020)
- c350d71 Add files via upload (uc-creat, Oct 14, 2020)
- dc934a2 Update convolutional_neural_network.ipynb (uc-creat, Oct 14, 2020)
- b5efdf5 Rename convolutional_neural_network.ipynb to Full_Connection.ipynb (uc-creat, Oct 14, 2020)
- ca17e7a Update Readme.md (uc-creat, Oct 14, 2020)
- 79d7051 Create Description.md (uc-creat, Dec 5, 2020)
- e845818 Add files via upload (uc-creat, Dec 5, 2020)
- 4a3dd8e Delete Social Network Ads.ipynb (uc-creat, Dec 5, 2020)
- 8a7d02b Add files via upload (uc-creat, Dec 5, 2020)
- 34ac60c Delete social_network_ads.ipynb (uc-creat, Dec 5, 2020)
- d87038b Add files via upload (uc-creat, Dec 5, 2020)
- 8a05d37 Add files via upload (uc-creat, Dec 5, 2020)
- 05cd264 Update Description.md (uc-creat, Dec 5, 2020)
- bca7d46 Update Description.md (uc-creat, Dec 5, 2020)
- 6d403d7 Update Readme.md (uc-creat, Dec 5, 2020)
- a06687f Update README.md (uc-creat, Feb 27, 2021)
607 changes: 607 additions & 0 deletions (t6) Convolutional Neural Networks/Full_Connection.ipynb

Large diffs are not rendered by default.

65 changes: 65 additions & 0 deletions (t6) Convolutional Neural Networks/Full_Connection.md
@@ -0,0 +1,65 @@
# Artificial Neural Networks (ANN):

Artificial neural networks (ANNs) are biologically inspired computer programs designed to simulate the way in which the human brain processes information.
To understand this better, consider the figure below:

![Neuron](https://upload.wikimedia.org/wikipedia/commons/8/86/1206_The_Neuron.jpg)

The figure shows a neuron of the human brain. The brain learns through experience: it adjusts the importance it attaches to certain events, giving some of them priority.
This prioritisation is done by assigning a weight to each event, and the weighted signals are passed on to other neurons across connections called synapses.
An ANN works on a similar principle.
Consider the figure below:

![ANN](https://groupfuturista.com/blog/wp-content/uploads/2019/03/Artificial-Neural-Networks-Man-vs-Machine-735x400.jpeg)

There are three layers:
* Input layer - takes the input
* Hidden layer - adjusts the weights (the learning phase)
* Output layer - produces the result

### Working of an ANN:
ANNs gather their knowledge by detecting patterns and relationships in data and learn (or are trained) through experience, not from explicit programming. An ANN is formed from
hundreds of simple units - artificial neurons or processing elements (PEs) - connected by coefficients (weights), which constitute the neural structure and are organised in
layers. The power of neural computation comes from connecting neurons in a network. Each PE has weighted inputs, a transfer function and one output. The behavior of a neural
network is determined by the transfer functions of its neurons, by the learning rule, and by the architecture itself. The weights are the adjustable parameters and, in that
sense, a neural network is a parameterized system. The weighted sum of the inputs constitutes the activation of the neuron. The activation signal is passed through a transfer
function to produce the neuron's single output, and this transfer function introduces non-linearity into the network. During training, the inter-unit connections are optimized
until the error in predictions is minimized and the network reaches the specified level of accuracy. Once the network is trained and tested, it can be given new input
information to predict the output.
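
As a minimal sketch of the weighted sum and transfer function described above, here is a single processing element in Python; the input values, weights and the sigmoid transfer function are illustrative assumptions, not taken from the notebook in this PR:

```python
import numpy as np

def neuron_output(inputs, weights, bias):
    """One processing element: a weighted sum of inputs passed through a transfer function."""
    activation = np.dot(weights, inputs) + bias   # weighted sum = the neuron's activation
    return 1.0 / (1.0 + np.exp(-activation))      # sigmoid transfer function adds non-linearity

# Example: three input values feeding a single artificial neuron
x = np.array([0.5, 0.1, 0.9])    # inputs from the previous layer
w = np.array([0.4, -0.2, 0.7])   # adjustable weights (the parameters learned during training)
b = 0.1                          # bias term
print(neuron_output(x, w, b))    # single output passed on to the next layer
```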

### Backpropagation - adjusting of the weights:
We calculate the total error at the output nodes and propagate these errors back through the network using Backpropagation to calculate the gradients. Then we use an
optimization method such as Gradient Descent to adjust all the weights in the network, with the aim of reducing the error at the output layer.

![backpropagation](https://ujwlkarn.files.wordpress.com/2016/08/screen-shot-2016-08-09-at-11-53-06-pm.png?w=748)

Suppose that the new weights associated with the node in consideration are w4, w5 and w6 (after Backpropagation has adjusted them).
If we now feed the same example to the network again, it should perform better than before, since the weights have been adjusted to minimize the error in
prediction. As shown in the figure below, the errors at the output nodes reduce to [0.2, -0.2], compared with [0.6, -0.4] earlier. This means that our network has learnt to
correctly classify our first training example.

![adjusting_weights](https://ujwlkarn.files.wordpress.com/2016/08/screen-shot-2016-08-09-at-11-53-15-pm.png?w=748)

We repeat this process with all other training examples in our dataset. Then, our network is said to have learnt those examples.
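
A minimal sketch of the weight-update step that Backpropagation feeds into; the weights, gradients and learning rate below are made-up illustrative values, and this is plain gradient descent rather than the exact optimizer used in the notebook:

```python
import numpy as np

def gradient_descent_step(weights, gradients, learning_rate=0.1):
    """Move each weight against its error gradient, as computed by backpropagation."""
    return weights - learning_rate * gradients

# Illustrative values only: three weights (think w4, w5, w6) and their error gradients
w = np.array([0.3, -0.5, 0.8])
grad = np.array([0.12, -0.04, 0.25])   # dError/dw from backpropagation
w = gradient_descent_step(w, grad)     # in a real network, gradients are recomputed after every update
print(w)
```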


### General view of ANN:

![flowchart](https://miro.medium.com/max/1168/0*ZJtto33Yo-gc4xPa.png)


# Full Connection:
Full connection represents the overall connection of the Convolutional Neural Network - (CNN + ANN).
Consider the example of predicting whether or not a car is present in an image. For this purpose we train the convolutional layer and reduce the dimensions of its output by
max pooling, followed by flattening. This reduction of dimensions is done so that the ANN can take the result as its input:
an ANN cannot take an image directly as input, so we first have to reduce its dimensions.
After flattening the image, we get a form in which the ANN can accept the input.
At the end we can also apply a **softmax function**, so that the output comes out as class probabilities (a binary car / no-car decision here) on which the loss is computed.

![Full_connection](https://1d-cnn.hostforjusteasy.fun/img/794152cb48c9bc9774e72bf7c0d6366c.png)

The full connection can be seen in the above figure.
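
A minimal sketch of this CNN + ANN full connection in Keras; the layer sizes, the 64x64 input shape and the two-class softmax output are assumptions chosen for the car / no-car example, not the exact architecture of Full_Connection.ipynb:

```python
import tensorflow as tf

model = tf.keras.models.Sequential([
    # CNN part: convolution + max pooling reduce the image to compact feature maps
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
    # Flattening turns the feature maps into a 1-D vector the ANN can accept
    tf.keras.layers.Flatten(),
    # ANN part: fully connected (dense) hidden layer and a softmax output
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(2, activation='softmax'),   # car / no-car probabilities
])

model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.summary()
```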

---
2 changes: 1 addition & 1 deletion (t6) Convolutional Neural Networks/Readme.md
@@ -5,6 +5,6 @@
| --------- | ---------------- | ---------- |
| **1) Convolutional Layer:**<br>a. Convolution Operations<br>b. What are filters<br>c. What is convolution Layer<br>d. Need and working of convolution layer<br>e. ReLU Layer | Ashish Kumar Panigrahy | |
| **2) Max Pooling and Flattening** <br>a. What is pooling<br>b. Need for pooling<br>c. Types of pooling<br>d. Working of Max pooling<br>e. What is flattening<br>f. Working of flattening | Pooja Thakkar | |
| **3) Full connection** <br>a. Brief working of ANN/DNN<br>b. Concept of Backpropagation<br>c. Full working fo the entire connection-CNN+ANN | Utkarsh Chauhan |
| **3) Full connection** <br>a. Brief working of ANN/DNN<br>b. Concept of Backpropagation<br>c. Full working fo the entire connection-CNN+ANN | Utkarsh Chauhan | Completed

#### Mentor: Om Rastogi
2 changes: 1 addition & 1 deletion README.md
@@ -1,7 +1,7 @@
# Contribution-program
Repository for competition in open-source contribution under DevIncept.

### Follow these steps to make a contribution:
### Follow these

1. Fork this repository.

3 changes: 3 additions & 0 deletions intern-basics/Readme.md
@@ -28,3 +28,6 @@ Part4: Basic functions in OpenCV Notebook

Part5: Simple sketching program- Trivedh



Social-Network-Ads ---- Utkarsh Chauhan
22 changes: 22 additions & 0 deletions intern-basics/Social_Network_Ads/Description.md
@@ -0,0 +1,22 @@
# Social Network ads

Machine learning technology in social media allows machines to decide which advertisements should be shown to which audience. Platforms collect data from
users, analyze it, work out their preferences, and accordingly show advertisements that hold their interest.


So, in this project we aim to get an idea of which customers, and how many of them, will buy an item such as a car if we advertise it (the car, in this case).
The data records how many customers bought the car and how many did not.
Along with this purchase information, each customer's age and estimated salary are also collected.

So, in this case, we use two parameters as the basis for predicting whether a customer will buy the car or not:

1. Age
1. Estimated Salary

So, first we import a few Python libraries and load the dataset.
Then we split the dataset into training and test sets.

Now, because we will be using a kernel SVM, we need to scale our features: the range of all features should be normalized so that each feature contributes approximately proportionately to the final distance.

After feature scaling we fit our model on the training set and predict the outcomes on the test set.
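
A minimal end-to-end sketch of the steps above with scikit-learn; the file name Social_Network_Ads.csv and the column names Age, EstimatedSalary and Purchased are assumptions about the uploaded dataset, so adjust them if the notebook uses different ones:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix, accuracy_score

# Load the dataset (assumed file and column names)
dataset = pd.read_csv('Social_Network_Ads.csv')
X = dataset[['Age', 'EstimatedSalary']].values
y = dataset['Purchased'].values

# Split into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Feature scaling so both features contribute proportionately to the kernel distance
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)

# Kernel SVM (RBF kernel) fitted on the scaled training data
classifier = SVC(kernel='rbf', random_state=0)
classifier.fit(X_train, y_train)

# Predict and evaluate on the test set
y_pred = classifier.predict(X_test)
print(confusion_matrix(y_test, y_pred))
print(accuracy_score(y_test, y_pred))
```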
