Made with MarkdownSlides https://github.com/asanzdiego/markdownslides: a script to create slides from MD files.
The program source code is licensed under GPL 3.0.
Java supports these programming paradigms:
Write once, run anywhere
Java is for developing cross-platform, general-purpose applications.
```java
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello World");
    }
}
```
Gordon Moore, co-founder of Intel
This “receiving” section obtains information (data and computer programs) from input devices and places it at the disposal of the other units for processing.
This “shipping” section takes information the computer has processed and places it on various output devices to make it available for use outside the computer.
Memory, Primary Memory or RAM (Random Access Memory).
This rapid-access, relatively low-capacity “warehouse” section retains information that has been entered through the input unit, making it immediately available for processing when needed. The memory unit also retains processed information until it can be placed on output devices by the output unit. Information in the memory unit is volatile (lost when the computer’s power is turned off).
The Arithmetic and Logic Unit performs calculations, such as addition, subtraction, multiplication and division. It also contains the decision mechanisms that allow the computer, for example, to compare two items from the memory unit to determine whether they’re equal.
In today’s systems, the ALU is implemented as part of the next logical unit, the CPU.
The Central Processing Unit coordinates and supervises the operation of the other sections. The CPU tells the input unit when information should be read into the memory unit, tells the ALU when information from the memory unit should be used in calculations and tells the output unit when to send information from the memory unit to certain output devices.
A multicore processor implements multiple processors on a single integrated-circuit chip and, hence, can perform many operations simultaneously.
This is the long-term, high-capacity “warehousing” section.
Programs or data not actively being used by the other units normally are placed on secondary storage devices (e.g., your hard drive) until they’re again needed.
Information on secondary storage devices is persistent (it’s preserved even when the computer’s power is turned off).
Secondary storage information takes much longer to access than information in primary memory, but its cost per unit is much less.
Examples of secondary storage devices include solid-state drives (SSDs), hard drives, DVD drives and USB flash drives, some of which can hold over 2 TB (TB stands for terabytes).
A bit (short for “binary digit”) is the smallest data item in a computer; it can take the value 0 or 1.
Data items processed by computers form a data hierarchy that becomes larger and more complex in structure as we progress from the simplest data items (bits) to richer ones, such as characters and fields.
It’s tedious for people to work with data in the low-level form of bits. Instead, they prefer to work with decimal digits (0-9), letters (A-Z and a-z), and special symbols (e.g., $, @, %, &, *, (, ), –, +, ", :, ? and /).
Digits, letters and special symbols are known as characters.
Computer’s character set: all the characters used to write programs and represent data items.
Computers process only 1s and 0s, so a computer’s character set represents every character as a pattern of 1s and 0s.
Java uses Unicode characters that are composed of one, two or four bytes (8, 16 or 32 bits).
Unicode contains characters for many of the world’s languages
The ASCII (American Standard Code for Information Interchange) character set is a popular subset of Unicode that represents uppercase and lowercase letters, digits and some common special characters.
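The character codes above can be inspected directly in Java. This is a small illustrative sketch (the class name `CharacterCodes` is my own); it shows that an ASCII letter keeps its familiar numeric code, and that a character beyond the 16-bit range is stored as a pair of `char` values:

```java
public class CharacterCodes {
    public static void main(String[] args) {
        char letter = 'A';
        // An ASCII character's Unicode value matches its ASCII code
        System.out.println((int) letter); // 65

        // U+1D11E (musical symbol G clef) lies outside the 16-bit range,
        // so Java stores it as two chars (a surrogate pair)
        String clef = "\uD834\uDD1E";
        System.out.println(clef.length());        // 2 chars...
        System.out.println(clef.codePointAt(0));  // ...but one code point: 119070
    }
}
```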
Just as characters are composed of bits, fields are composed of characters or bytes.
A field is a group of characters or bytes that conveys meaning. For example, a field consisting of uppercase and lowercase letters can be used to represent a person’s name, and a field consisting of decimal digits could represent a person’s age.
A record is a group of related fields (implemented as a class in Java).
Example: payroll system. The record for an employee might consist of the following fields (possible types for these fields are shown in parentheses):
In the preceding example, all the fields belong to the same employee. A company might have many employees and a payroll record for each.
```java
public class Employee {
    private int idEmployee;
    private String name;
    private String address;
    private double hourlyPayRate;
    // ...
}
```
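As a sketch of how such a record class might be completed and used, here is a minimal version of the Employee class; the constructor, the `grossPay` method and the demo values are illustrative additions, not part of the original example:

```java
public class Employee {
    private int idEmployee;       // identification number (int)
    private String name;          // name (String)
    private String address;       // address (String)
    private double hourlyPayRate; // hourly pay rate (double)

    public Employee(int idEmployee, String name, String address, double hourlyPayRate) {
        this.idEmployee = idEmployee;
        this.name = name;
        this.address = address;
        this.hourlyPayRate = hourlyPayRate;
    }

    // Gross pay for a given number of hours worked
    public double grossPay(double hours) {
        return hours * hourlyPayRate;
    }

    public static void main(String[] args) {
        Employee e = new Employee(1, "Ada Lovelace", "10 Main St", 25.0);
        System.out.println(e.grossPay(40)); // 1000.0
    }
}
```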
A file is a group of related records.
More generally, a file contains arbitrary data in arbitrary formats. In some operating systems, a file is viewed simply as a sequence of bytes. Any organization of the bytes in a file, such as organizing the data into records, is a view created by the application programmer.
It’s not unusual for an organization to have many files, some containing billions, or even trillions, of characters of information.
A database is a collection of data organized for easy access and manipulation.
The most popular model is the relational database, in which data is stored in simple tables. A table includes records and fields.
The amount of data being produced worldwide is enormous and growing quickly.
Big data applications deal with massive amounts of data, and this rapidly growing field creates lots of opportunity for software developers.
Programmers write instructions in various programming languages, some directly understandable by computers and others requiring intermediate translation steps. Hundreds of such languages are in use today. These may be divided into three general types:
Any computer can directly understand only its own machine language, defined by its hardware design.
Machine languages consist of strings of numbers (ultimately reduced to 1s and 0s) that instruct computers to perform their most elementary operations one at a time.
Machine dependent (a particular machine language can be used on only one type of computer)
Programming in machine language was simply too slow and tedious for most programmers.
Programmers began using English-like abbreviations to represent elementary operations. These abbreviations formed the basis of assembly languages.
Assemblers (translator programs) were developed to convert early assembly-language programs to machine language at computer speeds.
To speed the programming process, high-level languages were developed in which single statements could be written to accomplish substantial tasks.
High-level languages allow you to write instructions that look almost like everyday English and contain commonly used mathematical notations.
Compilers convert high-level language programs into machine language. Grace Hopper invented the first compiler in the early 1950s.
Compiling a large high-level language program into machine language can take considerable computer time.
Interpreter programs, developed to execute high-level language programs directly, avoid the delay of compilation, although they run slower than compiled programs.
Java uses a clever performance-tuned mixture of compilation and interpretation to run programs.
Java is the world’s most widely used high-level programming language.
Objects, or more precisely, the classes objects come from, are essentially reusable software components.
There are date objects, time objects, audio objects, video objects, automobile objects, people objects, etc.
Almost any noun can be reasonably represented as a software object in terms of attributes (e.g., name, color and size) and behaviors (e.g., calculating, moving and communicating).
Using a modular, object-oriented design and implementation approach can make software-development groups much more productive than was possible with earlier popular techniques like “structured programming”.
Object-oriented programs are often easier to understand, correct and modify.
A simple analogy: Suppose you want to drive a car and make it go faster by pressing its accelerator pedal.
Before you can drive a car, someone has to design it.
A car typically begins as engineering drawings, similar to the blueprints that describe the design of a house.
These drawings include the design for an accelerator pedal.
The pedal hides from the driver the complex mechanisms that actually make the car go faster, just as the brake pedal “hides” the mechanisms that slow the car, and the steering wheel “hides” the mechanisms that turn the car.
This enables people with little or no knowledge of how engines, braking and steering mechanisms work to drive a car easily.
Just as you cannot cook meals in the kitchen of a blueprint, you cannot drive a car’s engineering drawings.
Before you can drive a car, it must be built from the engineering drawings that describe it.
A completed car has an actual accelerator pedal to make it go faster, but even that’s not enough—the car won’t accelerate on its own (hopefully!), so the driver must press the pedal to accelerate the car.
Let’s use the car example to introduce some key object-oriented programming concepts.
Just as someone has to build a car from its engineering drawings before you can actually drive a car, you must build an object of a class before a program can perform the tasks that the class’s methods define.
The process of doing this is called instantiation.
An object is then referred to as an instance of its class.
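The car analogy maps directly onto code. In this sketch (the `Car` class and its methods are hypothetical, invented for illustration), the `new` operator performs the instantiation, and `myCar` is then an instance of class `Car`:

```java
// Hypothetical Car class: the "engineering drawing"
class Car {
    private int speed = 0;

    public void pressAccelerator() { speed += 10; } // behavior
    public int getSpeed() { return speed; }         // attribute access
}

public class CarDemo {
    public static void main(String[] args) {
        Car myCar = new Car();      // instantiation: build an object from the class
        myCar.pressAccelerator();   // now the object can perform its tasks
        System.out.println(myCar.getSpeed()); // 10
    }
}
```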
Use a building-block approach to creating your programs. Avoid reinventing the wheel, use existing high-quality pieces wherever possible. This software reuse is a key benefit of object-oriented programming.
A new class of objects can be created conveniently through inheritance: the new class (called the subclass) starts with the characteristics of an existing class (called the superclass), possibly customizing them and adding unique characteristics of its own.
In our car analogy, an object of class “convertible” certainly is an object of the more general class “automobile”, but more specifically, the roof can be raised or lowered.
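The convertible example might be sketched as follows (class and method names are my own, chosen to mirror the analogy): the subclass inherits `describe` from the superclass and adds roof-handling behavior of its own.

```java
class Automobile {
    public String describe() { return "an automobile"; }
}

// Convertible starts with Automobile's characteristics
// and adds a unique one: a roof that can be raised or lowered
class Convertible extends Automobile {
    private boolean roofUp = true;

    public void lowerRoof() { roofUp = false; }
    public boolean isRoofUp() { return roofUp; }
}

public class InheritanceDemo {
    public static void main(String[] args) {
        Convertible c = new Convertible();
        System.out.println(c.describe()); // inherited from Automobile
        c.lowerRoof();
        System.out.println(c.isRoofUp()); // false
    }
}
```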
Java also supports interfaces, collections of related methods that typically enable you to tell objects what to do, but not how to do it.
A “basic-driving capabilities” interface consisting of a steering wheel, an accelerator pedal and a brake pedal would enable a driver to tell the car what to do.
Once you know how to use this interface for turning, accelerating and braking, you can drive many types of cars, even though manufacturers may implement these systems differently.
A class implements zero or more interfaces, each of which can have one or more methods, just as a car implements separate interfaces for basic driving functions, controlling the radio, controlling the heating and air conditioning systems.
Just as car manufacturers implement capabilities differently, classes may implement an interface’s methods differently.
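A minimal sketch of this idea (the interface and class names are invented for illustration): both car types expose the same `accelerate` operation, but each implements it differently, just as manufacturers do.

```java
// Tells objects WHAT to do, not HOW to do it
interface BasicDriving {
    String accelerate();
}

class GasCar implements BasicDriving {
    public String accelerate() { return "opening throttle"; }
}

class ElectricCar implements BasicDriving {
    public String accelerate() { return "increasing motor current"; }
}

public class InterfaceDemo {
    public static void main(String[] args) {
        // The driver uses one interface for many types of cars
        BasicDriving[] cars = { new GasCar(), new ElectricCar() };
        for (BasicDriving car : cars) {
            System.out.println(car.accelerate()); // same call, different implementations
        }
    }
}
```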
For projects so large and complex, you should not simply sit down and start writing programs.
Analyzing and designing your system from an object-oriented point of view is called an object-oriented analysis-and-design (OOAD) process.
The Unified Modeling Language (UML) is now the most widely used graphical scheme for modeling object-oriented systems.
Operating systems are software systems that make using computers more convenient for users, application developers and system administrators.
Popular desktop operating systems include Linux, Windows and macOS (formerly called OS X)
The most popular mobile operating systems used in smartphones and tablets are Google’s Android and Apple’s iOS (for iPhone, iPad and iPod Touch devices)
Microsoft developed (mid-1980s) the Windows operating system, consisting of a graphical user interface (GUI) built on top of DOS (Disk Operating System)—an enormously popular personal-computer operating system that users interacted with by typing commands.
Windows borrowed from many concepts (such as icons, menus and windows) developed by Xerox PARC and popularized by early Apple Macintosh operating systems.
Windows 10 is Microsoft’s latest operating system—its features include enhancements to the Start menu and user interface, Cortana personal assistant for voice interactions, Action Center for receiving notifications, Microsoft’s new Edge web browser, and more.
Windows is a proprietary operating system—it’s controlled by Microsoft exclusively.
It is by far the world’s most widely used desktop operating system.
Linux operating system is perhaps the greatest success of the open-source movement (GNU/Linux naming controversy)
Open-source software departs from the proprietary software development style that dominated software’s early years.
With open-source development, individuals and companies contribute their efforts in developing, maintaining and evolving software in exchange for the right to use that software for their own purposes, typically at no charge.
Open-source code is often scrutinized by a much larger audience than proprietary software, so errors often get removed faster.
The Linux kernel is the core of the most popular open-source, freely distributed, full-featured operating system.
It’s developed by a loosely organized team of volunteers.
Linux has become extremely popular on servers and in embedded systems, such as Google’s Android-based smartphones.
Apple, founded in 1976 by Steve Jobs and Steve Wozniak, quickly became a leader in personal computing.
Apple’s proprietary operating system, iOS, is derived from Apple’s macOS and is used in the iPhone, iPad, iPod Touch, Apple Watch and Apple TV devices.
In 2014, Apple introduced its new Swift programming language, which became open source in 2015. The iOS app-development community is shifting from Objective-C to Swift.
Android is based on the Linux kernel and Java.
Android apps can also be developed in C++ and C.
One benefit of developing Android apps is the openness of the platform. The operating system is open source and free.
Developed by Android, Inc., which was acquired by Google in 2005.
Microprocessors have had a profound impact in intelligent consumer-electronic devices, including the recent explosion in the “Internet of Things”.
Sun Microsystems in 1991 funded an internal corporate research project, called Oak, led by James Gosling, which resulted in a C++-based object-oriented programming language that Sun called Java.
Using Java, you can write programs that will run on a great variety of computer systems and computer-controlled devices. “write once, run anywhere”.
It’s also the key language for developing Android smartphone and tablet apps.
Sun Microsystems was acquired by Oracle in 2010.
Java has become the most widely used general-purpose programming language with more than 10 million developers.
You can create each class and method you need to form your programs. However, most Java programmers take advantage of the rich collections of existing classes and methods in the Java class libraries, also known as the Java APIs (Application Programming Interfaces).
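As one small example of reusing the Java APIs rather than reinventing the wheel, the `java.time` library handles calendar arithmetic for you (the class name `LibraryDemo` and the chosen dates are mine):

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

public class LibraryDemo {
    public static void main(String[] args) {
        // Reuse the java.time API instead of hand-writing date arithmetic
        LocalDate start = LocalDate.of(2000, 1, 1);
        LocalDate end = LocalDate.of(2001, 1, 1);
        System.out.println(ChronoUnit.DAYS.between(start, end)); // 366 (2000 was a leap year)
    }
}
```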
Normally there are five phases to create and execute a Java application:
Integrated development environments (IDEs) provide tools that support the software development process, such as editors, debuggers for locating logic errors that cause programs to execute incorrectly and more.
The most popular Java IDEs are:
To compile a program called Welcome.java, you’d type in your system’s command line: javac Welcome.java
If the program compiles, the compiler produces a .class file called Welcome.class that contains the program’s bytecodes.
Machine-language instructions are platform dependent (dependent on specific computer hardware); bytecode instructions are platform independent.
So, Java’s bytecodes are portable: without recompiling the source code, the same bytecode instructions can execute on any platform containing a JVM that understands the version of Java in which the bytecodes were compiled.
To execute the program, you’d type: java Welcome
This begins Phase 3. IDEs typically provide a menu item, such as Run, that invokes the java command for you.
The JVM places the program in memory to execute it; this is known as loading.
The JVM’s class loader takes the .class files containing the program’s bytecodes and transfers them to primary memory.
It also loads any of the .class files provided by Java that your program uses.
The .class files can be loaded from a disk on your system or over a network.
In Phase 4, as the classes are loaded, the bytecode verifier examines their bytecodes to ensure that they’re valid and do not violate Java’s security restrictions.
Java enforces strong security to make sure that Java programs arriving over the network do not damage your files or your system (as computer viruses and worms might).
JVM executes the bytecodes to perform the program’s specified actions.
In early Java versions, the JVM was simply a Java-bytecode interpreter. Most programs would execute slowly, because the JVM would interpret and execute one bytecode at a time. Some modern computer architectures can execute several instructions in parallel.
Today’s JVMs typically execute bytecodes using a combination of interpretation and just-in-time (JIT) compilation.
In this process, the JVM analyzes the bytecodes as they’re interpreted, searching for hot spots (bytecodes that execute frequently).
For these parts, a just-in-time (JIT) compiler, such as Oracle’s Java HotSpot™ compiler, translates the bytecodes into the computer’s machine language.
Each of the preceding phases can fail because of various errors.
This would cause the Java program to display an error message.
So, you’d return to the edit phase, make the necessary corrections and proceed through the remaining phases again to determine whether the corrections fixed the problem(s).
Errors such as division by zero occur as a program runs, so they’re called runtime errors or execution-time errors.
Fatal runtime errors cause programs to terminate immediately without having successfully performed their jobs.
Nonfatal runtime errors allow programs to run to completion, often producing incorrect results.
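The division-by-zero example compiles without complaint but fails as the program runs. In this sketch (the class name is mine), the error would be fatal if uncaught; here it is caught so the program can report it and continue:

```java
public class RuntimeErrorDemo {
    public static void main(String[] args) {
        int[] values = {7, 0};
        // Integer division by zero is a runtime error in Java:
        // it throws an ArithmeticException when this line executes
        try {
            int result = values[0] / values[1];
            System.out.println(result);
        } catch (ArithmeticException e) {
            // Uncaught, this exception would terminate the program (a fatal runtime error)
            System.out.println("Runtime error: " + e.getMessage()); // "/ by zero"
        }
    }
}
```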
In the late 1960s, ARPA—the Advanced Research Projects Agency of the United States Department of Defense—rolled out plans for networking the main computer systems of approximately a dozen ARPA-funded universities and research institutions.
The computers were to be connected with communications lines operating at speeds on the order of 50,000 bits per second, a stunning rate at a time when most people (of the few who even had networking access) were connecting over telephone lines to computers at a rate of 110 bits per second.
Academic research was about to take a giant leap forward.
ARPA proceeded to implement what quickly became known as the ARPANET, the precursor to today’s Internet.
Things worked out differently from the original plan.
Although the ARPANET enabled researchers to network their computers, its main benefit proved to be the capability for quick and easy communication via what came to be known as electronic mail (e-mail).
This is true even on today’s Internet, with e-mail, instant messaging, file transfer and social media such as Facebook and Twitter enabling billions of people worldwide to communicate quickly and easily.
The protocol (set of rules) for communicating over the ARPANET became known as the Transmission Control Protocol (TCP). TCP ensured that messages, consisting of sequentially numbered pieces called packets, were properly routed from sender to receiver, arrived intact and were assembled in the correct order.
In parallel with the early evolution of the Internet, organizations worldwide were implementing their own networks for both intraorganization (that is, within an organization) and interorganization (that is, between organizations) communication.
A huge variety of networking hardware and software appeared. One challenge was to enable these different networks to communicate with each other.
ARPA accomplished this by developing the Internet Protocol (IP), which created a true “network of networks”, the current architecture of the Internet.
The combined set of protocols is now called TCP/IP. Each Internet-connected device has an IP address—a unique numerical identifier used by devices communicating via TCP/IP to locate one another on the Internet.
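IP addresses are exposed directly in the Java APIs. This sketch (class name mine; the loopback address 127.0.0.1 is used so no network access is needed) shows how a program obtains an `InetAddress` from a numeric IP-address string:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class IpDemo {
    public static void main(String[] args) throws UnknownHostException {
        // A literal IPv4 address is parsed directly; no DNS lookup occurs
        InetAddress addr = InetAddress.getByName("127.0.0.1");
        System.out.println(addr.getHostAddress()); // 127.0.0.1
    }
}
```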
Businesses rapidly realized that by using the Internet, they could improve their operations and offer new and better services to their clients.
Companies started spending large amounts of money to develop and enhance their Internet presence. This generated fierce competition among communications carriers and hardware and software suppliers to meet the increased infrastructure demand.
As a result, bandwidth—the information-carrying capacity of communications lines—on the Internet has increased tremendously, while hardware costs have plummeted.
The World Wide Web (simply called “the web”) is a collection of hardware and software associated with the Internet that allows computer users to locate and view documents (with various combinations of text, graphics, animations, audios and videos) on almost any subject.
In 1989, Tim Berners-Lee of CERN (the European Organization for Nuclear Research) began developing HyperText Markup Language (HTML)—the technology for sharing information via “hyperlinked” text documents.
Berners-Lee also wrote communication protocols such as HyperText Transfer Protocol (HTTP) to form the backbone of his new hypertext information system, which he referred to as the World Wide Web.
In 1994, he founded the World Wide Web Consortium (W3C), devoted to developing web technologies. One of the W3C’s primary goals is to make the web universally accessible to everyone regardless of disabilities, language or culture.
The applications-development methodology of mashups enables you to rapidly develop powerful software applications by combining (often free) complementary web services and other forms of information feeds.
ProgrammableWeb provides a directory of over 16,500 APIs and 6,300 mashups. Their API University includes how-to guides and sample code for working with APIs and creating your own mashups. According to their website, some of the most widely used APIs are Facebook, Google Maps, Twitter and YouTube.
| Web services source | How it’s used |
|---|---|
| Google Maps | Mapping services |
| | Microblogging |
| YouTube | Video search |
| | Social networking |
| | Photo sharing |
| | Social networking for business |
| PayPal | Payments |
A thing is any object with an IP address and the ability to send data automatically over the Internet. Such things include:
List of buzzwords that you’ll hear in the software development community:
Software is complex. Large, real-world software applications can take many months or even years to design and implement.
When large software products are under development, they typically are made available to the user communities as a series of releases, each more complete and polished than the last.
There are many online forums in which you can get your Java questions answered and interact with other Java programmers. Some popular Java and general programming forums include: