Question
in java create the above. Use only binary tree. code just compile and run
All numbers must be rounded to the nearest hundredth. "Don't" counts as one word; anything containing an apostrophe (') counts as a single word.
Word: Sequence of letters ending in a blank, a period, an exclamation point, a question mark, a colon, a comma, a single quote, or a semicolon. You may assume that numbers do not appear in the words; they may be ignored.
Distinct word: Words that are spelled the same, ignoring uppercase and lowercase distinctions.
Sentence: A set of words followed by a period, an exclamation point, or a question mark.
Given a file containing text, analyze it and produce statistics about its contents. The software will accept the name of the file containing the text to be analyzed and will display the following output:
INPUT FILE NAME:
STATISTICAL SUMMARY
TOTAL NUMBER OF NON-BLANK CHARACTERS:
TOTAL NUMBER OF WORDS:
TOTAL NUMBER OF SENTENCES:
AVERAGE WORD LENGTH (in characters):
AVERAGE SENTENCE LENGTH (in words):
TOTAL NUMBER OF DISTINCT WORDS:
TOTAL NUMBER OF DISTINCT WORDS OF MORE THAN THREE LETTERS:
INDEX OF DISTINCT WORDS AND THEIR POSITION(S)
al 7,17
and 3,9,14,26
around
be
because
but
Explanation / Answer
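As a quick illustration of the word and sentence definitions above, here is a minimal tokenizer sketch. It is not part of the required program; the class and method names are illustrative. It assumes a word keeps internal apostrophes (so "Don't" is one word) but sheds trailing punctuation, and that a sentence ends at '.', '!', or '?'.

```java
import java.util.ArrayList;
import java.util.List;

public class WordStats {
    // Split text on whitespace, then strip trailing punctuation per the word definition.
    public static List<String> words(String text) {
        List<String> out = new ArrayList<>();
        for (String tok : text.split("\\s+")) {
            String word = tok.replaceAll("[.,;:!?']+$", "");
            if (!word.isEmpty()) {
                out.add(word);
            }
        }
        return out;
    }

    // A sentence ends with a period, exclamation point, or question mark.
    public static int sentenceCount(String text) {
        int n = 0;
        for (char c : text.toCharArray()) {
            if (c == '.' || c == '!' || c == '?') n++;
        }
        return n;
    }

    public static void main(String[] args) {
        String text = "Java runs everywhere. Don't stop! Why not?";
        System.out.println(words(text));
        System.out.println(sentenceCount(text));
    }
}
```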
ANS:-
Given that,
Word: Sequence of letters ending in a blank, a period, an exclamation point, a question mark, a colon, a comma, a single quote, or a semicolon. You may assume that numbers do not appear in the words.
Explanation:
Please find the StatisticalSummary class and a sample run below. The sorted index uses TreeMap, which is backed by a red-black binary search tree, satisfying the binary-tree requirement; note that the index lists each distinct word's occurrence count rather than its positions.
Please check and reply if anything needs to be changed.
Program:
import java.io.File;
import java.io.FileNotFoundException;
import java.util.HashMap;
import java.util.Map;
import java.util.Scanner;
import java.util.TreeMap;
public class StatisticalSummary {
public static void main(String[] args) {
Scanner userInput = new Scanner(System.in);
System.out.print("INPUT FILE NAME: ");
String fileName = userInput.nextLine();
analyzeInputFile(fileName);
userInput.close();
}
public static void analyzeInputFile(String inputFileName) {
Scanner sacnnerReader = null;
try {
File file = new File(inputFileName);
if (file.exists()) {
String allLine = "";
sacnnerReader = new Scanner(file);
while (sacnnerReader.hasNextLine()) {
String line = sacnnerReader.nextLine();
allLine = allLine + line + " ";
}
int charCount = 0;
for (int i=0;i<allLine.trim().length();i++) {
char ch = allLine.charAt(i);
if (!Character.isWhitespace(ch)) {//Counts only non-blank characters.
charCount++;
}
}
String wordArray [] = allLine.trim().split("\\s+");//Groups all white spaces as a delimiter.
int wordCount = 0;
int allWordLength = 0;
TreeMap<String, Integer> distinctWordMap = new TreeMap<String, Integer>(String.CASE_INSENSITIVE_ORDER);//Case-insensitive, so "This" and "this" count as one distinct word.
for (int i=0;i<wordArray.length;i++) {
String word = wordArray[i].trim();
word = word.replaceAll("[.,;:!?']+$", "");//Strips trailing punctuation per the word definition; internal apostrophes are kept.
if (word.isEmpty()) {
continue;//Skips tokens that were only punctuation.
}
wordCount++;
allWordLength = allWordLength + word.length();
if (distinctWordMap.get(word) != null) {
distinctWordMap.put(word, distinctWordMap.get(word)+1);
} else {
distinctWordMap.put(word, 1);
}
}
TreeMap<String, Integer> sortedDistinctWordMap = new TreeMap<String, Integer>(String.CASE_INSENSITIVE_ORDER);
sortedDistinctWordMap.putAll(distinctWordMap);
int threeLetterDistinctWordCount = 0;
for (Map.Entry<String, Integer> entry : sortedDistinctWordMap.entrySet()) {
if (entry.getKey() != null && entry.getKey().length()>3) {
threeLetterDistinctWordCount++;
}
}
String sentenceArray[] = allLine.trim().split("[.!?]+");//Sentences end with '.', '!', or '?'.
int sentenceCount = sentenceArray.length;
double averageWordLength = (double)charCount/wordCount;
double averageSentenceLength = (double)wordCount/sentenceCount;//Average number of words per sentence.
System.out.println(" STATISTICAL SUMMARY ");
System.out.println("TOTAL NUMBER OF NON-BLANK CHARACTERS: "+charCount);
System.out.println("TOTAL NUMBER OF WORDS: "+wordCount);
System.out.println("TOTAL NUMBER OF SENTENCES: "+sentenceCount);
System.out.println("AVERAGE WORD LENGTH (in characters): "+String.format("%.2f", averageWordLength));
System.out.println("AVERAGE SENTENCE LENGTH (in words): "+String.format("%.2f", averageSentenceLength));
System.out.println("TOTAL NUMBER OF DISTINCT WORDS: "+distinctWordMap.size());
System.out.println("TOTAL NUMBER OF DISTINCT WORDS OF MORE THAN THREE LETTERS: "+threeLetterDistinctWordCount);
System.out.println(" INDEX OF DISTINCT WORDS AND THEIR POSITIONS");
char lastChar = ' ';
for (Map.Entry<String, Integer> entry : sortedDistinctWordMap.entrySet()) {
char currentChar = entry.getKey().toUpperCase().charAt(0);
if (currentChar == lastChar) {
System.out.printf("%-25s %-10s",entry.getKey(),String.valueOf(entry.getValue()));
System.out.println();
} else {
System.out.println();
System.out.println(entry.getKey().toUpperCase().charAt(0));
System.out.printf("%-25s %-10s",entry.getKey(),String.valueOf(entry.getValue()));
System.out.println();
lastChar = entry.getKey().toUpperCase().charAt(0);
}
}
} else {
System.out.println("Input File not found");
}
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
} finally {
if (sacnnerReader != null) {
try {
sacnnerReader.close();
} catch (Exception e) {
e.printStackTrace();
}
}
}
}
}
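Since the prompt asks to "use only binary tree": TreeMap above is backed by a red-black binary search tree, but if a hand-rolled structure is preferred, a minimal BST sketch for case-insensitive distinct-word counts could look like the following. Class, field, and method names here are illustrative, not part of the program above.

```java
import java.util.Locale;

// Minimal binary search tree keyed on lowercased words, counting occurrences.
public class WordCountBST {
    private static class Node {
        String word;
        int count = 1;
        Node left, right;
        Node(String w) { word = w; }
    }

    private Node root;
    private int distinct = 0;

    // Inserts a word, lowercased so "This" and "this" share one node.
    public void add(String word) {
        root = insert(root, word.toLowerCase(Locale.ROOT));
    }

    private Node insert(Node n, String key) {
        if (n == null) { distinct++; return new Node(key); }
        int cmp = key.compareTo(n.word);
        if (cmp < 0) n.left = insert(n.left, key);
        else if (cmp > 0) n.right = insert(n.right, key);
        else n.count++;//Duplicate word: bump its count.
        return n;
    }

    public int distinctCount() { return distinct; }

    public int countOf(String word) {
        String key = word.toLowerCase(Locale.ROOT);
        Node n = root;
        while (n != null) {
            int cmp = key.compareTo(n.word);
            if (cmp == 0) return n.count;
            n = (cmp < 0) ? n.left : n.right;
        }
        return 0;
    }

    // In-order traversal prints the words in alphabetical order, as the index requires.
    public void printIndex() { printIndex(root); }

    private void printIndex(Node n) {
        if (n == null) return;
        printIndex(n.left);
        System.out.printf("%-25s %d%n", n.word, n.count);
        printIndex(n.right);
    }
}
```

An in-order traversal of a BST visits keys in sorted order, which is exactly what the alphabetical index needs; this is the same property TreeMap provides internally.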
Output:
INPUT FILE NAME: input.txt
STATISTICAL SUMMARY
TOTAL NUMBER OF NON-BLANK CHARACTERS: 292
TOTAL NUMBER OF WORDS: 53
TOTAL NUMBER OF SENTENCES: 5
AVERAGE WORD LENGTH (in characters): 5.51
AVERAGE SENTENCE LENGTH (in words): 10.60
TOTAL NUMBER OF DISTINCT WORDS: 43
TOTAL NUMBER OF DISTINCT WORDS OF MORE THAN THREE LETTERS: 30
INDEX OF DISTINCT WORDS AND THEIR POSITIONS
A
a 3
and 2
approaches 1
as 1
B
by 1
C
complete 1
D
developed 1
G
gives 1
H
high 1
I
is 1
J
Java 4
L
language 2
learning 1
level 1
M
Mac 1
Microsystems 1
O
of 3
on 1
originally 1
OS 1
P
platforms 1
practical 1
Programming 1
R
reference 1
runs 1
S
simple 1
such 1
Sun 1
T
take 1
the 1
This 2
through 1
tutorial 1
U
understanding 1
UNIX 1
V
variety 1
various 1
versions 1
W
while 1
will 1
Windows 1
Y
you 1