Question
Object-Oriented Programming (Java)
Go to finance.yahoo.com and download data for a stock of your choice.
Data download: type in the stock symbol, go to Historical Data, and select a start date of 01/01/2006 and an end date of 12/31/2016.
Save it as a CSV file (comma-separated values).
Questions:
Use the adjusted close for all computations.
Q1: Compute the 20-day moving average: avg(n) = avg(n-1)*factor + (close of today)*(1-factor). A worked example follows Q2.
Q2: Compute the standard deviation for each day.
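(Worked example of the Q1 recurrence, assuming factor = 0.5 as in the code below and using the first two adjusted closes of the sample data: avg(1) = 143.699997, so avg(2) = 0.5 * 143.699997 + 0.5 * 143.660004 = 143.6800005.)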
Take your time and please give me an accurate answer.
Use this: I have an answer below; please check it and try to give me the whole, corrected answer.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class Stocks {
    static double values[] = new double[100];
    static double averages[] = new double[100];
    static int no_of_values = 0;

    public static void main(String[] args) throws NumberFormatException, IOException {
        BufferedReader reader = new BufferedReader(new FileReader("stock_data.csv"));
        String line;
        reader.readLine(); // skip the header line
        // read the data and store it in the values array
        while ((line = reader.readLine()) != null) {
            String tokens[] = line.split(",");
            values[no_of_values++] = Double.parseDouble(tokens[4]);
        }
        if (no_of_values > 0) {
            averages[0] = values[0];
            // considering a factor of 0.5 in the moving average
            for (int i = 1; i < no_of_values; i++) {
                averages[i] = averages[i - 1] + values[i];
            }
            System.out.println("Moving Average is: " + averages[no_of_values - 1]);
            // get the overall average
            double average = 0;
            for (int i = 0; i < no_of_values; i++) {
                average += values[i];
            }
            average = average / no_of_values;
            // find the deviation for each day w.r.t. the overall average
            double squaredDeviation = 0;
            for (int i = 0; i < no_of_values; i++) {
                squaredDeviation = Math.pow(values[i] - average, 2);
                System.out.println("Standard Deviation for Day " + (i + 1) + " is: " + Math.sqrt(squaredDeviation));
            }
        }
        reader.close();
    }
}
stock_data.csv:
Date,Open,High,Low,Close,Volume,Adj Close
4/3/2017,143.710007,144.119995,143.050003,143.699997,19967200,143.699997
3/31/2017,143.720001,144.270004,143.009995,143.660004,19534100,143.660004
3/30/2017,144.190002,144.5,143.5,143.929993,21179100,143.929993
3/29/2017,143.679993,144.490005,143.190002,144.119995,29120900,144.119995
3/28/2017,140.910004,144.039993,140.619995,143.800003,33320700,143.800003
3/27/2017,139.389999,141.220001,138.619995,140.880005,23493200,140.880005
3/24/2017,141.5,141.740005,140.350006,140.639999,22025300,140.639999
3/23/2017,141.259995,141.580002,140.610001,140.919998,20285700,140.919998
3/22/2017,139.850006,141.600006,139.759995,141.419998,25787600,141.419998
3/21/2017,142.110001,142.800003,139.729996,139.839996,39116800,139.839996
3/20/2017,140.399994,141.5,140.229996,141.460007,20213100,141.460007
3/17/2017,141,141,139.889999,139.990005,43597400,139.990005
3/16/2017,140.720001,141.020004,140.259995,140.690002,19132500,140.690002
3/15/2017,139.410004,140.75,139.029999,140.460007,25566800,140.460007
3/14/2017,139.300003,139.649994,138.839996,138.990005,15189700,138.990005
3/13/2017,138.850006,139.429993,138.820007,139.199997,17042400,139.199997
3/10/2017,139.25,139.360001,138.639999,139.139999,19488000,139.139999
3/9/2017,138.740005,138.789993,137.050003,138.679993,22065200,138.679993
3/8/2017,138.949997,139.800003,138.820007,139,18681800,139
3/7/2017,139.059998,139.979996,138.789993,139.520004,17267500,139.520004
3/6/2017,139.369995,139.770004,138.600006,139.339996,21155300,139.339996
OUTPUT:
Take your time, no rush.
Please give me a correct and complete answer. I promise I will give you a thumbs up. Thank you.
Explanation / Answer
Hi,
Please see the updated class below. The main changes from your version: the moving-average loop now applies the given recurrence avg(n) = avg(n-1)*factor + (close of today)*(1-factor) with factor = 0.5 (your loop was simply summing the prices), and the price is read from the last column (index 6), which is the adjusted close the assignment asks for. Note that Yahoo's CSV lists the newest day first, so the recurrence runs from the most recent day backward; sort the rows oldest-first if you want a conventional forward moving average. For Q2, the loop reports each day's deviation from the overall mean (squaring and then taking the square root yields the absolute deviation). Please comment for any queries/feedback.
Thanks,
Stocks.java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class Stocks {
    static double values[] = new double[100];
    static double averages[] = new double[100];
    static int no_of_values = 0;

    public static void main(String[] args) throws NumberFormatException, IOException {
        BufferedReader reader = new BufferedReader(new FileReader("stock_data.csv"));
        String line;
        reader.readLine(); // skip the first line
        double factor = 0.5;
        // read the data and store the adjusted close in the values array
        while ((line = reader.readLine()) != null) {
            String tokens[] = line.split(",");
            values[no_of_values++] = Double.parseDouble(tokens[6]); // index 6 = Adj Close
        }
        if (no_of_values > 0) {
            averages[0] = values[0];
            // Q1: compute the 20-day moving average with factor 0.5:
            // avg(n) = avg(n-1)*factor + close of today*(1-factor)
            for (int i = 1; i < no_of_values; i++) {
                averages[i] = (averages[i - 1] * factor) + (values[i] * (1 - factor));
            }
            System.out.println("Moving Average is: " + averages[no_of_values - 1]);
            // get the overall average
            double average = 0;
            for (int i = 0; i < no_of_values; i++) {
                average += values[i];
            }
            average = average / no_of_values;
            // Q2: deviation of each day's price from the overall average
            // (squaring and then taking the square root yields the absolute deviation)
            double squaredDeviation = 0;
            for (int i = 0; i < no_of_values; i++) {
                squaredDeviation = Math.pow(values[i] - average, 2);
                System.out.println("Standard Deviation for Day " + (i + 1) + " is: " + Math.sqrt(squaredDeviation));
            }
        }
        reader.close();
    }
}
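Q2 is a bit ambiguous: the class above reports each day's deviation from the overall mean. If your instructor instead wants a rolling standard deviation over the trailing 20-day window, here is a minimal sketch you could add to the class (the window length of 20, the population-variance formula, and the method name printRollingStdDev are my assumptions); call it after the file is read, e.g. printRollingStdDev(values, no_of_values, 20):

static void printRollingStdDev(double[] values, int count, int window) {
    // For each day that has a full trailing window, compute the window's
    // mean and population standard deviation.
    for (int i = window - 1; i < count; i++) {
        double mean = 0;
        for (int j = i - window + 1; j <= i; j++) {
            mean += values[j];
        }
        mean /= window;
        double variance = 0;
        for (int j = i - window + 1; j <= i; j++) {
            variance += Math.pow(values[j] - mean, 2);
        }
        variance /= window;
        System.out.println("20-day StdDev ending Day " + (i + 1) + " is: " + Math.sqrt(variance));
    }
}

With only 20 rows in the sample file this prints a single line; over the full 2006-2016 download it prints one value per day from day 20 onward.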
stock_data.csv (note: this copy has no header row, so the reader.readLine() call above consumes the 4/3/2017 line, leaving exactly 20 trading days for the computation):
4/3/2017,143.710007,144.119995,143.050003,143.699997,19967200,143.699997
3/31/2017,143.720001,144.270004,143.009995,143.660004,19534100,143.660004
3/30/2017,144.190002,144.5,143.5,143.929993,21179100,143.929993
3/29/2017,143.679993,144.490005,143.190002,144.119995,29120900,144.119995
3/28/2017,140.910004,144.039993,140.619995,143.800003,33320700,143.800003
3/27/2017,139.389999,141.220001,138.619995,140.880005,23493200,140.880005
3/24/2017,141.5,141.740005,140.350006,140.639999,22025300,140.639999
3/23/2017,141.259995,141.580002,140.610001,140.919998,20285700,140.919998
3/22/2017,139.850006,141.600006,139.759995,141.419998,25787600,141.419998
3/21/2017,142.110001,142.800003,139.729996,139.839996,39116800,139.839996
3/20/2017,140.399994,141.5,140.229996,141.460007,20213100,141.460007
3/17/2017,141,141,139.889999,139.990005,43597400,139.990005
3/16/2017,140.720001,141.020004,140.259995,140.690002,19132500,140.690002
3/15/2017,139.410004,140.75,139.029999,140.460007,25566800,140.460007
3/14/2017,139.300003,139.649994,138.839996,138.990005,15189700,138.990005
3/13/2017,138.850006,139.429993,138.820007,139.199997,17042400,139.199997
3/10/2017,139.25,139.360001,138.639999,139.139999,19488000,139.139999
3/9/2017,138.740005,138.789993,137.050003,138.679993,22065200,138.679993
3/8/2017,138.949997,139.800003,138.820007,139,18681800,139
3/7/2017,139.059998,139.979996,138.789993,139.520004,17267500,139.520004
3/6/2017,139.369995,139.770004,138.600006,139.339996,21155300,139.339996
Sample output:
Moving Average is: 139.29936316663935
Standard Deviation for Day 1 is: 2.876003699999984
Standard Deviation for Day 2 is: 3.1459926999999936
Standard Deviation for Day 3 is: 3.3359946999999863
Standard Deviation for Day 4 is: 3.0160027000000014
Standard Deviation for Day 5 is: 0.09600470000000882
Standard Deviation for Day 6 is: 0.14400130000001354
Standard Deviation for Day 7 is: 0.13599769999999012
Standard Deviation for Day 8 is: 0.6359976999999901
Standard Deviation for Day 9 is: 0.9440042999999889
Standard Deviation for Day 10 is: 0.6760066999999879
Standard Deviation for Day 11 is: 0.793995300000006
Standard Deviation for Day 12 is: 0.09399830000000975
Standard Deviation for Day 13 is: 0.32399330000001214
Standard Deviation for Day 14 is: 1.793995300000006
Standard Deviation for Day 15 is: 1.5840033000000062
Standard Deviation for Day 16 is: 1.6440013000000135
Standard Deviation for Day 17 is: 2.1040073000000064
Standard Deviation for Day 18 is: 1.7840003000000024
Standard Deviation for Day 19 is: 1.2639963000000023
Standard Deviation for Day 20 is: 1.444004299999989
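One last note, beyond what the assignment strictly requires: a full 01/01/2006 to 12/31/2016 download has roughly 2,700 trading days, so the fixed double[100] arrays above will overflow on real data. Below is a sketch of a more robust reader (the class name StocksRobust is mine; it assumes the CSV keeps Yahoo's header row with an "Adj Close" column, as in the question's version of the file):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class StocksRobust {
    public static void main(String[] args) throws IOException {
        BufferedReader reader = new BufferedReader(new FileReader("stock_data.csv"));
        // Locate the Adj Close column from the header instead of hard-coding an index.
        List<String> header = Arrays.asList(reader.readLine().split(","));
        int adjCloseIdx = header.indexOf("Adj Close");
        if (adjCloseIdx < 0) {
            reader.close();
            throw new IOException("Adj Close column not found in header");
        }
        // A growable list avoids the fixed-size array overflowing on large files.
        List<Double> values = new ArrayList<>();
        String line;
        while ((line = reader.readLine()) != null) {
            values.add(Double.parseDouble(line.split(",")[adjCloseIdx]));
        }
        reader.close();
        System.out.println("Read " + values.size() + " adjusted closes");
    }
}

The moving-average and deviation loops from the class above carry over unchanged; just iterate over values.size() instead of no_of_values.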