Question
1) Implement -r
2) use a ring buffer (thereby avoiding any buffer overflow issues)
3) use only pointer notation (avoid a[i] notation)
Write a C program that will read data one
character at a time and output the results to standard output. Roughly,
your program should mostly copy the input stream to the output stream,
but it should delete all non-ASCII characters, 'make visible' the
'non-printing characters' from the input, and print only the LAST
72 characters of each line as it writes out the (new and improved) data.
Specifically, the changes that should be made to the incoming data are:
1) remove all non-printing characters (character codes 0 through 31,
and 127) and replace them with the same sort of codes the vi
editor uses. For example, the bell character (character code 7)
is replaced by the two printable characters "^" and "G" (G is the seventh
letter of the alphabet). Most of these are shown under the 'Ctrl char'
column of http://www.physics.udel.edu/~watson/scen103/ascii.html
If you have a copy of the Foster "C By Discovery" text (highly recommended,
and dirt cheap if you buy an older edition), Appendix-I also lists the ASCII
character set, and it shows all the proper two-character sequences that
should be used for the non-printing characters. There are two special cases:
Newlines should be replaced by a dollar sign ("$") and then an actual
newline. Replace the delete character by "^?" (Actually, if you use modular
arithmetic, the delete character does not have to be a special case.)
2) delete all non-ASCII characters (those with character codes 128 through 255).
3) once you have (internally) made the above changes, print only the LAST 72
[non-newline] characters (followed by a newline). Thus, a single Control-G
in the input line will account for two of those 72 characters ("^" and "G").
Regardless of the length of the line, the last visible character will
be a "$" -- except in the one (weird) case where the last 'line' of the
file does not end in a newline; in that case, there will be no "$" at
the end of that 'line' (and no newline, either).
4) Your p4 should also pay attention to command-line parameters. Invoked as:
p4
...your program MUST read from stdin, but if invoked as:
p4 somefilename
...then your program MUST instead fopen() "somefilename" for reading, and then
use getc() on this new filestream (in much the same way as you used getchar()
when reading from stdin). Indeed, getchar() is actually defined as getc(stdin),so it makes sense to just use getc() in every case.
Explanation / Answer
1) Implement -r
Serialization isn't just a Java concept; it's supported in most high-level languages. It's the process of converting an in-memory object into a formatted series of bytes to be persisted (stored on disk) or sent across a network.
You might ask yourself, "Why can't a receiving PC just copy the object from the network straight into my Java program's memory?" The formatting must occur because hardware platforms may use different byte orderings, memory layouts, etc.
If a little-endian x86 CPU (a MacBook, say) sends a multi-byte value over the network to a big-endian PowerPC CPU (a Power Mac G4) without serialization, the bytes will be interpreted in the wrong order. The same problem arises if you write an object's raw memory to a file on one machine and read it back on another.
Little-endian is a byte ordering in memory: least significant byte first. The number 1024 stored on a little-endian CPU is:
00000000 00000100
On a big-endian CPU (most significant byte first) it's:
00000100 00000000
Note that the conventional written representation of 1024 in binary matches the big-endian layout:
00000100 00000000
Obviously, if the receiving big-endian machine just copies the data directly as it came in over the network from the MacBook and interprets the bytes in its own order, your Java program ends up with an integer value of 4 instead of 1024.
The Serializable interface in Java marks an object as needing to be formatted before being persisted or sent across the network; any object written out this way must be serializable. You don't need to implement any methods: it's a marker interface that simply lets the runtime know serialization applies. Many of the classes in the Java library (String, for example) already implement it, so often no extra work is needed.
// Write an in-memory object to the file system
ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream("<file>"));

oos.writeObject("1"); // String already implements Serializable, great!

class Foo {}
oos.writeObject(new Foo()); // No: throws NotSerializableException

class Bar implements Serializable {}
oos.writeObject(new Bar()); // OK