ADT is a data collection and analysis company that provides subscription software tools to its customers. Its main function is to assist clients in decision-making. It works with large volumes of consumer-related data, such as consumer profiles and purchase patterns. Clients log into ADT's portal, where software organizes this data according to their requirements. Clients can subscribe to ADT's services on a pay-per-use or annual-fee basis.
The information currently in ADT's database is raw. It is stored and processed by the software the company provides, according to each client's use. Clients subscribe to ADT's services through the website to enhance their e-commerce business and the associated site. ADT uses a single platform called "C-plank" that supports e-commerce and traditional sales of complex products and pricing models. The company is spending time and resources to provide better solutions so clients can easily align their product offerings, prices, and promotions across web and traditional channels. ADT collects data from different sources so it can combine this bulk of data and offer clients more accurate results, helping them develop strategies, make decisions, and shape product offerings.
Outsourcing Activity
In-house cloud computing services are not yet common in the company; nevertheless, ADT provides cloud computing solutions (especially SaaS) to its clients. ADT outsources most of its business functions to other companies, including website maintenance, hardware maintenance, and messaging services. These processes may appear innocuous, but the current CEO views them as an integral part of ADT's core business processes, critical to operations, and therefore believes they need to be carried out in-house. The CEO emphasizes the significance of cloud computing benefits within the company and wants to minimize the offshoring and outsourcing components. Too much outsourcing also exposes the company to certain risks. Moreover, the board and top management do not have complete information regarding the business processes and are therefore partially disengaged from the core business.
Human Capital at ADT
The primary business functions of ADT are handled by three levels of professionals: data analysts (three people), a Business Analytics Optimizing Specialist (one person), and a Director of Business Intelligence (one person). Data analysts have several key activities. One essential function is to extract meaningful data from multiple sources, including the distribution, sales, inventory, and customer databases of the clients. These sources are then linked with market data to create an integrated whole for decision-making. Their second and third responsibilities are working with multiple large and complex SQL databases, and designing and building simplified reports, respectively.
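As a minimal sketch of that first task, the snippet below joins a client's sales data with market data to produce one simplified report. The table and column names, the sample rows, and the use of SQLite are illustrative assumptions, not ADT's actual schema.

```python
import sqlite3

# Illustrative stand-ins for a client's sales database and a market-data store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (product_id INTEGER, quantity INTEGER, price REAL);
    CREATE TABLE market_data (product_id INTEGER, category_avg_price REAL);
    INSERT INTO sales VALUES (1, 40, 9.5), (1, 25, 9.0), (2, 10, 30.0);
    INSERT INTO market_data VALUES (1, 10.2), (2, 28.5);
""")

# Link the client's sales with market data into a simplified report.
report_sql = """
    SELECT s.product_id,
           SUM(s.quantity)           AS units_sold,
           SUM(s.quantity * s.price) AS revenue,
           m.category_avg_price      AS market_avg_price
    FROM sales s
    JOIN market_data m ON m.product_id = s.product_id
    GROUP BY s.product_id, m.category_avg_price
    ORDER BY revenue DESC;
"""

for product_id, units, revenue, market_avg in conn.execute(report_sql):
    print(f"product {product_id}: {units} units, {revenue:.2f} revenue "
          f"(market avg price {market_avg:.2f})")
```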
The Director of Business Intelligence's job is to design and apply the overall analytics and business intelligence strategy and to lead client projects. The director is also responsible for understanding key business behaviors, which are important because they determine retention, engagement, and client acquisition. In addition, he or she identifies new business opportunities, develops forecasting principles, applies statistical models, and designs segmentation schemes. The director also continually seeks to upgrade delivery solutions by collaborating with technology providers.
The Business Analytics Optimizing Specialist develops business intelligence reports for clients and then uses tools to convert business and IT information into forecasts, metrics, and plans, providing insights for business development. The specialist also works with clients to determine information requirements, develops and tests reports, and designs dashboards and user interfaces.
ADT's technical support functions include software engineers, networking staff, systems managers, and a Chief Information Officer; the rest of the staff work in marketing and administration. The current staff is well qualified and has enough experience to carry out its responsibilities in the proposed expansion plan. The Chief Information Officer (CIO) at ADT considers every application in the IT portfolio and applies it to business enhancement. The CIO is also expected to anticipate and prepare for all potential IT security issues, and to align services with internal capabilities and vice versa.
The Data Assets of ADT
ADT provides informational and analytical services to clients who primarily do e-commerce. It is not enough to supply only hard transactional data such as amount, product, purchase frequency, and the demographics of the client's customers. To make its solutions unique, ADT classifies its data into four categories: demographic, behavioral, attitudinal, and interactive.
Data such as age, income, gender, and educational qualification are demographic data. ADT collects these through orders, purchases, surveys, short interviews, registrations, circumstantial probes, event participation, and similar instances. Behavioral data includes information about patterns of customer behavior such as time of purchase, frequency of purchase, browsing patterns, the device or mode used for buying, rating participation, review writing, and social media presence. Much of the data in this category is gathered from online activity, observation, and reports by the customer on various occasions.
Interactive data includes the number of clicks on a web page, navigation paths, browsing patterns, search keywords used, and so on. The tools used to gather this information include Google Analytics, A/B testing, and other technical instruments. Qualitative, attitudinal data about customers is gathered through feedback surveys, questionnaires, and self-reports on aspects such as product preferences, opinions, inclination to make referrals, brand preference and loyalty, and general sentiment about the product, company, and website. Though attitudes are intangible measures of the customer profile, they have great potential to affect data in the other domains. When profiling a customer, ADT takes a holistic view and combines its existing data with the client's data to create reports.
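To illustrate how A/B test results might be judged, here is a hedged, standard-library-only sketch of a two-proportion z-test; the visitor and conversion counts are made up for the example.

```python
import math

def ab_test_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test comparing variant A against variant B."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = ab_test_z(120, 2400, 155, 2350)   # made-up click data
print(f"z = {z:.2f}, p = {p:.4f}")       # p < 0.05 suggests a real difference
```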
Information Systems at ADT
For processing and deploying its services, ADT uses Sun Microsystems' Solaris version 5.1, a computer operating system developed by Sun Microsystems. ADT's top management justifies this choice on the grounds that the system seldom crashes and has many features that support ADT's services. Management reports that in the last two years the system was down for only 120 minutes because of technical problems, suggesting the current system is adequate.
A database is vital to information and data management: it stores and retrieves related information in a multiuser environment, prevents unauthorized access, protects data, and provides facilities for failure recovery. ADT uses a client/server architecture on an Oracle database. Currently there is a dedicated, centralized data and application server, an internet server, and desktops connected to the server in a UNIX model. The company uses a traditional database architecture that stores all the information needed for business intelligence; from this database, ADT derives its reports and dashboards for clients.
Interface and Dashboard
ADT has a paradigm for its analytics and classifies its information, analysis, and decisions according to it. The major elements of the analytic framework are descriptive analytics, diagnostic analytics, predictive analytics, and prescriptive analytics. Descriptive analytics describes what is happening, diagnostic analytics answers why something is happening, predictive analytics indicates what is likely to happen, and prescriptive analytics addresses what should be done. At present ADT's focus is on predictive analytics, and in the course of its expansion it intends to strengthen its services in prescriptive analytics. Depending on the kind of analytics it provides to each client, ADT designs the user interface and dashboard adaptively.
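To make the distinction concrete, the short sketch below contrasts descriptive and predictive analytics on hypothetical monthly sales figures; a simple least-squares trend line stands in for the far richer predictive models ADT would actually use.

```python
# Hypothetical sales for the last six months (all numbers are illustrative).
sales = [110, 118, 123, 131, 142, 150]

# Descriptive analytics: what is happening?
print(f"mean monthly sales: {sum(sales) / len(sales):.1f}")

# Predictive analytics: what is likely to happen?
# Ordinary least-squares trend line fitted by hand.
n = len(sales)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(sales) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean
print(f"forecast for next month: {slope * n + intercept:.1f}")
```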
Infrastructure and Security system
ADT runs its business on the following infrastructure: a web data capture methodology to gather data from multiple sources, a data warehouse, a DBMS (Oracle), statistical analysis tools (SPSS and SAS), modeling and predictive tools, reporting software, collaboration tools, and a Sun Microsystems network.
Deliverable 2
Project Overview
This is a data collection company that has been in operation for less than two years. The company aims to create a data repository that goes beyond normal relational databases. The 10 TB data warehouse is expected to grow by 20% every year. This project aims to use best practices in data warehousing to create a data warehouse that can handle all the business data and accommodate future growth. This document covers the scope of the project, the current and future requirements, the risks involved, and the assumptions that have been made.
Background including current process
This project is the creation of a data warehouse, going beyond standard relational databases, that will be able to handle all the data collected by the company. The new data warehouse should meet current requirements and accommodate the projected growth, which means it must scale continuously with those requirements.
The project goal is to identify all the requirements of the business and develop a solution that meets them. The solution should be efficient, comfortably meet the minimum requirements identified, and, given the projected growth of the databases involved, be scalable enough to keep up with that growth.
IT's role in this project is to identify the information technology best practices that will lead to the best possible solution, and to identify the best architecture for implementing it so as to deliver the most efficient data warehouse.
Scope of the Project
Applications
For this project, an access application and a database management application will be implemented. The access application will enable users to reach the solution; it will be hosted on the application server, while the database management application will be hosted on the database server.
Sites
An offsite data center will be developed for remote access by wireless devices.
Process Re-engineering
Process re-engineering leads the organization toward a new solution that can help strengthen the functions of the company.
Interfaces
This project requires a number of interfaces, to which users will have limited access depending on the position they hold in the organization, as sketched below. The primary interfaces are the data entry interface, the data retrieval interface, and the dashboard interface. The data entry interface is used to enter data; the data retrieval interface helps in selecting and retrieving the required data from the system; and the dashboard interface is available only to administrators, assisting them in monitoring employee performance, troubleshooting, and maintaining the system.
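A minimal sketch of this position-based access follows; the role names and the permission table are illustrative assumptions, not the project's actual scheme.

```python
# Which roles may open which interface (illustrative mapping).
PERMISSIONS = {
    "data_entry":     {"clerk", "analyst", "administrator"},
    "data_retrieval": {"analyst", "administrator"},
    "dashboard":      {"administrator"},
}

def can_access(role: str, interface: str) -> bool:
    """Return True if the given role may open the given interface."""
    return role in PERMISSIONS.get(interface, set())

assert can_access("administrator", "dashboard")
assert not can_access("clerk", "dashboard")    # clerks never see the dashboard
```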
Architecture
The architecture selected for the organization is three-tiered. A client-server architecture will be implemented in which clients access the solution, an application server hosts the applications needed for access (enabling remote access to the system from a range of devices), and a database server manages the database.
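The sketch below compresses the three tiers into one process to show the request flow; in production each tier would run on its own server, and the table, data, and query are illustrative assumptions.

```python
import sqlite3

# Stand-in for the database server (tier 3).
_db = sqlite3.connect(":memory:")
_db.execute("CREATE TABLE reports (client_id INTEGER, title TEXT)")
_db.execute("INSERT INTO reports VALUES (42, 'Q1 sales summary')")

def database_tier(query, params=()):
    """Tier 3: the database server executes queries against the warehouse."""
    return _db.execute(query, params).fetchall()

def application_tier(client_id):
    """Tier 2: the application server validates requests and builds queries."""
    if client_id <= 0:
        raise ValueError("invalid client id")
    return database_tier("SELECT title FROM reports WHERE client_id = ?",
                         (client_id,))

def client_tier():
    """Tier 1: the client device displays the result."""
    for (title,) in application_tier(42):
        print(title)

client_tier()
```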
Conversion
The data currently stored in the database will be transferred to the new system and converted according to data type. This conversion ensures that all previous data is available in the new system.
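A hedged sketch of that per-type conversion step is shown below; the column names and type mapping are illustrative, since the real mapping would come from the old system's data dictionary.

```python
from datetime import datetime
from decimal import Decimal

# Illustrative column-to-type mapping for the legacy-to-new migration.
CONVERTERS = {
    "customer_id":  int,
    "amount":       Decimal,
    "purchased_at": lambda s: datetime.strptime(s, "%Y-%m-%d"),
    "notes":        str,
}

def convert_row(raw_row: dict) -> dict:
    """Cast every legacy text field to the type expected by the new system."""
    return {col: CONVERTERS.get(col, str)(val) for col, val in raw_row.items()}

legacy = {"customer_id": "1007", "amount": "249.99",
          "purchased_at": "2015-03-14", "notes": "repeat buyer"}
print(convert_row(legacy))
```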
Testing
Testing is necessary to ensure the consistency of the system, so it will be performed on all aspects of the new system, including the access, application, and database levels. In addition, regular checks will be performed to ensure the system is working at its optimum level.
Funding
Project funding is limited to the resources made available by the company.
Training
Training classes will be provided to all employees of the organization so they can understand and learn to use the new system, covering the basic functioning of the new system's applications and other details. However, each employee will only have to learn his or her part of the new system.
Constraints and Assumptions
The following constraints have been identified:
- The project will be completed in a short period, which means the market assumptions made initially should still hold at its completion.
- Since the company is at an early stage, it is difficult for it to arrange ample funding for the project. The project will have to rely on creativity to keep costs down.
The following assumptions have been made in defining the scope, objectives and approach:
- The software and applications to be used in the project are readily available in the market, so it will not be difficult for the staff to arrange the required equipment, and the prices will be affordable.
- Funding will be available promptly for the project, ensuring on-time delivery and reducing the risk of losses or other expenses.
Risks
The following risks have been identified as possibly affecting the project during its progression:
- Rapid changes in technology are the biggest risk for this project. New organizations entering the market implement the latest technology in their systems and could render this project obsolete in a short period.
- Market forces are risks that have to be considered in the implementation of the project. The forces of supply and demand will affect the viability of the project. The demand for the company’s service might change due to market forces and this will affect the need for the solution. There should be flexibility in the project design so that changes can be made later on.
Scope Control
Changes to the scope identified in this document will be managed through Change Control, with business-owner approval required for any change that affects the project's cost or timeline.
Relationship to Other Systems/Projects
Business units must inform the IT department of other business initiatives that may influence the project. For instance, during the course of the project the company will look toward expansion in the regional market. This expansion will lead to more data being collected and will therefore affect the project's scope. The initial expansion will require company resources, which might create financial constraints for the project. To cover the expansion, the company will seek to raise funds from investors; the project itself will act as an enticement to investors as a sign of the company's growth and future capabilities.
Deliverable 3
Data warehousing requirement
Database systems are used to transform raw data into meaningful analytics, which are crucial for decision-making. Relational databases are good but have limited capabilities when enormous amounts of data are to be analyzed. For the ADT project, as the CIO of a data analytics organization, I would design a database solution built around data warehousing.
Data warehousing is the process of accumulating all of a company's data in one large repository so that seamless integration can be achieved. One characteristic of a data warehouse is that operational data keeps changing as business operations change: tables in the operational database are continuously altered or refreshed by removing old data and adding new data.
However, this old data is not discarded entirely; rather, it accumulates into volumes that range into terabytes and petabytes. This massive amount of data is what organizations use in data analytics to derive insights for decision-making. Warehouses offer a major opportunity for organizations with large amounts of data whose storage and integration requirements surpass what conventional databases provide.
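The following sketch shows the append-only loading this describes: each operational snapshot is stamped with a load date and accumulated rather than overwritten. The schema is an illustrative assumption.

```python
import sqlite3
from datetime import date

# Illustrative history table: old rows are never deleted, only appended to.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sales_history (
                    product_id INTEGER,
                    units      INTEGER,
                    load_date  TEXT)""")

def load_snapshot(rows):
    """Append today's operational snapshot; historical rows are kept."""
    today = date.today().isoformat()
    conn.executemany(
        "INSERT INTO sales_history VALUES (?, ?, ?)",
        [(pid, units, today) for pid, units in rows])

load_snapshot([(1, 40), (2, 17)])   # earlier snapshots remain queryable
```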
The warehouse is structured into specialty areas that represent different sectors of the business. Two major sectors are of interest to this organization: web analytics and financials. Together they comprise details concerning employees, departmental activities, and project characteristics.
Object-oriented data modeling, where used, can achieve the same results. Objects are denoted by different classes, with a provision to segregate the functional primary data from the processes employed in creating and modifying it. The functional areas are of great concern and should be mapped out clearly during the problem definition stage so that the data remains independent of the processes. Because this provision is so important, the design and development team should take note of it in order to produce a functional database.
The web analytics company requires a data warehouse that is highly integrated. To this end, the design team standardizes common data presentation terms before mapping is done. The designed data warehouse supports the distributed and centralized data systems essential for business operations: data will be distributed across the entire system and accessed via organizational networks.
The objective of any design process, including a data warehouse, is usability. A usable warehouse is simple and offers smooth interaction to its users; the interface is easy to navigate, so the learning curve is not steep and productivity improves. SQL is recommended as the preferred interface for the data warehouse because of features such as a multidimensional view of relational data and a well-integrated interface. It also offers options to retrieve, analyze, and format data effectively.
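As a hedged example of the kind of multidimensional SQL view intended here, the query below summarizes sales by region and month; the schema and sample rows are illustrative assumptions.

```python
import sqlite3

# Illustrative sales table with a region and time dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, purchased_at TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('east', '2015-01-10', 120.0),
        ('east', '2015-02-03',  75.5),
        ('west', '2015-01-21', 210.0);
""")

# Summarize the fact data along two dimensions at once.
rollup_sql = """
    SELECT region,
           strftime('%Y-%m', purchased_at) AS month,
           SUM(amount)                     AS revenue
    FROM sales
    GROUP BY region, month
    ORDER BY region, month;
"""

for region, month, revenue in conn.execute(rollup_sql):
    print(f"{region} {month}: {revenue:.2f}")
```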
Data warehouses also require a history-rewriting feature: the warehouse should allow successful data rollback via the implementation of what-if analysis. This functionality is achieved by granularizing data so that administrators can update rights to historical data. Data entered in a warehouse is non-volatile, so data should be loaded in large volumes. Data is transitioned from the database to the new warehouse in a planned process that employs cleansing tools; data scrubbing, migration, and auditing should be conducted to attain consistency.
Finally, the warehouse should be highly flexible. The data analytics company using the warehouse is growing rapidly, necessitating a data schema with the capacity to accommodate new incoming data.
Schema
A schema is a logical representation of the tables, views, and procedures in a database. The logical representation of the warehouse schema is as follows.
Entity relation Diagram
As shown above, an entity relationship data model is used to design the schema and denote its various entities. An entity relation diagram is a graphical representation of entities and their relationships, used in database and warehouse development. Drawing one requires a number of considerations: first, the major business operations are defined, and the outcomes of this definition are the entities; second come the attributes, which are defined by the various tables and storage fields in the warehouse.
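A minimal translation of such a diagram into warehouse tables is sketched below as a star schema: one fact table keyed to dimension tables. The entity and attribute names are illustrative assumptions.

```python
import sqlite3

# Illustrative star schema: dimensions plus a central fact table.
DDL = """
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_product  (product_id  INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    product_id  INTEGER REFERENCES dim_product(product_id),
    amount      REAL,
    sale_date   TEXT
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)   # the relationships mirror the diagram's entities
```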
This analytics company requires a data warehouse similar to a relational database system. Using the entity relation diagram, the processes are defined and translated into detailed plans, enabling the developers to build a relational data warehouse schema in tandem with the warehouse's intended functionality. Data in the old databases is moved to the new warehouse after cleaning, verification, and validation.
After these processes, data is stored in appropriate data marts.
The sources of data include customer information, supplier data, and manufacturer and management data. Since this data keeps growing day by day, the warehouse should be scalable, allowing expansion wherever necessary. Data warehouses offer massive storage capacity that can accommodate vast amounts of data, unlike databases with limited capacities.
Data Flow Diagram
A DFD illustrates the flow of information between the organization and its clients. When a company exchanges data with its customers, employees, or suppliers, records and transactional processes remain traceable. This information is contained in websites and information systems in the form of site visits, web log files, and Java and PHP scripts, among others. Data retrieved from information systems includes financial data and customer data, among other metrics. This data is fed into the company for processing, and the output is web reports, market reports, and other deliverables.
As shown above, client information is fed into the system for analysis, and a report is fed back to the client after the data has been run through analysis tools and processes to produce graphics, reports, and scorecard statistics. Each data mart holds a unique type and format of data: data in the sales and finance departments may be denoted by currency values, while data in the engineering and design sections may be represented as graphs and design objects. Data in each mart is mapped to the warehouse, where it is processed to deliver analytics in the form of graphics and reports.
Project plan
Deliverable 4
Warehousing and storing data is not the only job of IT professionals; they also have to provide the organization with infrastructure for analysis and decision-making. Data analytics is aimed at providing customer satisfaction, reducing costs, and fulfilling other client requirements. The availability of up-to-date, quality-controlled data is essential for making insightful decisions in an organization, and the ability to analyze big loads of data in real time enhances an organization's competitiveness in terms of quality control, customer satisfaction, and warranty costs.
On the other hand, data analytics is an intensive process that requires resources and knowledge to execute. For organizations that do not have idle servers to process big data workloads, tapping into the public cloud is a viable alternative to maintaining costly internal infrastructure.
Need for data analytics
An organization can leverage data analytics to sharpen its competitive edge over rivals in the same industry. Data obtained from the web is integrated with the operational data from the organization's information systems to derive insights valuable for decision-making. The data obtained in the data mining process needs to be analyzed, aggregated at different levels of abstraction, and finally put to good use; this is what e-commerce sites have been doing to understand customer behavior and build relationships. Data content usually comprises textual and graphical representations sourced from HTML and XML pages. The content carries semantic and structural metadata embedded within pages, which can be extracted using descriptive keywords.
Web analytics is essential for both large and emerging businesses, because such data is used to track defining business metrics and build dashboards which, when applied to the business environment, lead to numerous business opportunities.
A company using data analytics for decision-based management is more competitive than one that does not. In addition to organizational data from information systems, data from social networks can be integrated into the analytic process to facilitate an all-round analysis. A company seeking to increase its market coverage can, for instance, use social networking sites to study customer trends and tailor its products and services to meet customer needs.
Interface
An analytic tool provides an interface for the user to navigate and work through the analytic process; in this way, the user understands where they are, what they need to know, and what has been done. A notable data analytics issue is finding knowledgeable personnel who understand how to translate business questions into data and derive workable insights. If the interface is difficult to work with, the situation worsens, and the process can lead to incorrect and unreliable results.
The design considerations for data analytic tools include a simple platform that is easy to work with and understand. The basics are user-centric models that give the user sufficient content to set modeling parameters, and improved visuals with sufficient spacing and position markers for better summarization. Screenshots of some data analytic tools are shown in Appendix 1.
Analytics-as-a-service is a trend facilitated by the large number of SaaS vendors in the field. The usefulness of the cloud for data analytics rests on its zero requirements for infrastructure and configuration. This is the key selling point adopted by organizations such as Emcien Corp., which offers pattern recognition software as a service on Amazon Elastic Compute Cloud and draws clients from the telecommunications industry, large-scale retailers, and intelligence agencies, among others. There is no need to acquire any physical infrastructure; only a browser is required, and the user is good to go.
When it comes to analyzing voluminous amounts of data, public clouds are a better option than in-house establishments. Public clouds use a pay-per-use model, making them a good fit for finite data loads. In the cloud, big data analytic jobs can be parallelized, subdivided into smaller discrete tasks that map effectively onto workers. In addition, big data analytics companies have developed special templates for popular big data platforms like Hadoop, making it easy for administrators and data analysts to set up the required infrastructure for data processing.
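The split-map-combine pattern described here can be sketched in miniature with the standard library; a real deployment would use a platform like Hadoop, and the revenue aggregation below is an illustrative assumption.

```python
from multiprocessing import Pool

def analyze_chunk(chunk):
    """Aggregate one slice of the dataset independently of the others."""
    return sum(record["amount"] for record in chunk)

def parallel_total(records, workers=4, chunk_size=10_000):
    """Split the workload into discrete tasks, map them to workers, combine."""
    chunks = [records[i:i + chunk_size]
              for i in range(0, len(records), chunk_size)]
    with Pool(workers) as pool:
        return sum(pool.map(analyze_chunk, chunks))

if __name__ == "__main__":
    data = [{"amount": 1.0}] * 100_000   # stand-in for a large dataset
    print(parallel_total(data))          # 100000.0
```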
For big data workloads, the cloud is the only viable solution. Since data in organizations aggregates over time into volumes spanning terabytes and petabytes, the cloud offers a scalable solution that can be extended indefinitely. For example, Medio, a data analytics company, offers real-time multi-tenant architectures for its customers, tapping into the scale of Amazon Web Services to complement its data center; Medio can handle one billion data events in a day with a user base of over 15 million in a month. Even for modest workloads, the cloud is still the best alternative since it provides fast spin-up and spin-down.
One characteristic of data analytics is that it is unpredictable: once a business question has been answered, the underlying infrastructure may no longer be required as business dimensions shift. A caveat is that big data analytics in the cloud should not be an all-or-nothing proposition; an organization can usually gain value from a hybrid approach suited to its needs. For example, Archimedes Inc., a San Francisco-based company, manages a private Hadoop cluster for processing data using Univa Grid Engine software but keeps the front end on the AWS cloud.
In the same way, data analytics enhances data administration as the work shifts toward more analysis and less administration. Service delivery increases even with reduced resources and workforce because of the multiple layers of data derived. Using data visualization techniques and dashboards, reporting is enhanced: data can be segregated by region, demographics, and operational channel. This information is essential for rectifying underperforming sectors as well as identifying untapped markets for further exploitation.
The lack of professionals who can manage data in the cloud is the leading cause of inefficiencies, followed by security and physical constraints. The problem usually concerns the shortage of trained professionals who can pose business questions of the data in a meaningful way. While data latency and movement issues can be solved with time, money, and technology, data science and manipulation skills are hard to come by.
Recommendation
Security in cloud platforms applies to data analytics just as to any other information technology project. Data in the cloud poses serious challenges that can affect the business environment if not handled correctly; for instance, data breaches in transit or in the cloud cause serious, often irreversible, reputational and financial damage. To ensure that the exchange of information is protected, technological, logical, and operational safeguards must be enforced. Data should, for instance, be encrypted for transmission to cloud analytics vendors and back. In cases where such security methods cannot provide a solution, I recommend provisioning the analytic services in-house, as discussed below.
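A minimal sketch of encrypting data before it leaves for a cloud vendor is shown below, using the third-party `cryptography` package (installed with `pip install cryptography`); the payload is illustrative.

```python
from cryptography.fernet import Fernet

# The key stays on premises and is never sent to the vendor.
key = Fernet.generate_key()
cipher = Fernet(key)

payload = b"customer_id,amount\n1007,249.99"   # illustrative data
token = cipher.encrypt(payload)                # safe to transmit to the cloud
assert cipher.decrypt(token) == payload        # decrypted when results return
```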
The proliferation of security breaches has prompted organizations to establish a more solid grasp of their data; many demand that their data remain on premises for greater security management and comfort. Cloud-based analytical tools are good for organizational performance, but where an organization cannot risk having its data in the cloud, virtual analytic tools that can be run in-house have been developed. This hybrid concept is the best solution so far for traditional on-premise technologies. An example of such an offering is HP's CloudSystem converged infrastructure, which provides the same cloud management functionality as public cloud options and makes it easier for companies to burst into the cloud to satisfy their demands.
Deliverable 5
An analytics company would obviously want to use data analytic processes to advance its own business initiatives. Data analytics is a growing phenomenon in which a company derives value from the silos of data at its disposal; the need for it is founded on studying business trends and forecasting to enhance business activities. A company experiencing tremendous growth because of its business practices will therefore embrace data analytics. For the analytic process to succeed, the infrastructure on which it runs must be laid down, and with the need to maintain data confidentiality, integrity, and availability at all times, that infrastructure must be secured against possible attacks. A secure network and working environment are essential as the organization plans to expand physically and logically. This organization plans to expand its operations from one floor to three floors, and for it to achieve full business potential the network should be expanded too, so that efficient and seamless communication and internet availability are achieved across the three floors.
Network infrastructure
The proposed network for this company combines wired and wireless LANs. The wired LAN will meet the needs of users with workstations, printers, and scanners, while the wireless LAN will accommodate mobile users with laptops, iPads, tablets, and smartphones. This networking solution is intended to offer quality service as well as protect the organization's data from internal and external attacks.
A network diagram depicting the current composition of computing resources in an organization is as follows:
The proposed network design encompasses wired network for stationary resources such as printers and workstations and wireless extension to accommodate mobile users.
The network diagram is as shown below:
The components of the network are:
Wired network
The wired network serves the stationary, cabled devices in the organization. The organization subscribes to ISP services for an internet connection and implements the network through Ethernet standards; CAT 6 cables connect devices end-to-end to deploy the local area network (LAN).
Wireless router
A wireless router is used as a control device for the clients in the network. Its main purpose is to control the security features of the devices that access the network; because it controls network access and other important functions, it is considered the backbone of the network. Cisco offers a variety of wireless devices according to users' requirements. The Cisco Aironet 3500p is selected for this task: its access points are among the latest offerings from Cisco and address the layer 2 problems associated with previous routers and access points. Devices of this kind are mostly used in high-density organizational environments.
Access controllers
Access controllers sit between the access points and the secure part of the network; they are installed on the wired component of the network and survey it to determine the traffic load.
Cisco 5500 Series wireless LAN controllers are based on 802.11n wireless technology and can communicate with the network through up to eight physical data ports. The device supports speeds of up to 300 megabits per second and includes a WLAN interface and a built-in NAT that allows multiple PCs to share the internet connection. It also utilizes advanced MIMO technology and VPN pass-through.
Wireless Antennas
Wi-Fi Antennas are used to improve the communication range of wireless radio signals.
Switches
Switches are an important component of the network and are connected to the WLAN access points via switch ports. They provide the gateway to the wired network and allow all frames from WLAN clients to pass through to the enterprise network. The Cisco Catalyst 3750G-24WS WLAN controller is preferred: it supports up to 50 access points per switch and 200 access points per stack, offers enhanced security, mobility, and ease of use, and provides 24 Ethernet 10/100/1000 ports with Power over Ethernet plus two SFP transceiver-based GE ports.
Security enhancement
The organization sets specific security standards and protocols according to its needs, operations, and information. For this project, security enhancements will combine technical and logical methods: firewalls will secure the wired component of the network, and encryption techniques will secure the wireless network. ZoneAlarm is recommended as the firewall for this project.
A WLAN can encounter various problems, ranging from open ports and connection problems to range extension and security issues. These issues affect the reliability, security, and performance of the network.
WPA increases the level of data protection and access control on a WLAN, addressing the vulnerabilities associated with WEP. WPA secures all existing versions of 802.11 devices while minimizing the impact on the performance of Wi-Fi devices. WPA implements stronger authentication algorithms as well as user authentication to ensure that all data remains private to authorized users alone. It uses the Temporal Key Integrity Protocol (TKIP) to encrypt data and employs 802.1X authentication with the Extensible Authentication Protocol. Further, it uses a Message Integrity Check to enforce data integrity and protect the network from hacking attempts. Encryption uses 128-bit keys with dynamic session keys, meaning every key differs per session, per user, and per packet. The advantage of TKIP over WEP in a WLAN is key rotation: TKIP changes the keys used for RC4 every 10,000 packets and creates the initialization vector in a different way.
Security policy
Security policies are the set of rules proposed to determine the course of action of authorized users according to their credentials. ADT will implement a security policy to improve the level of security in the organization; these policies will guard against unauthorized access and enforce information confidentiality, integrity, and availability. Most organizations implement a network access policy, since it governs who is granted access to the network and under what conditions. The network access policy is as follows:
Network use policy
Organizations create networks to carry on business activities and research and development. Network engineers therefore segregate the network according to the requirements of the project, and users are given access to the different networks based on their position in the organization.
Authentication for External Connection
Remote users, or those using external connections, will be authenticated before they are given access to organization resources; the chief information officer authenticates remote users after ensuring certain conditions are met. To protect diagnostic ports, unnecessary ports attached to the network are disabled. Similarly, third parties are authenticated before being granted access.
Wireless devices policy
Network intruders usually attack the network through wireless devices; therefore, these are restricted too: they are first authenticated and then granted access. Wireless devices are authenticated through passphrases allotted to authorized users only, provided by the chief information officer. Users are required to guard their passphrases and must not share them with anyone, including external parties.
Network segregation
Network segregation is carried out on the basis of network resources. A separate network is preserved for top management and mission-critical operations, and others serve normal organizational operations. Different sets of credentials are issued to employees for access to each network.
Before a new network is brought into service, sufficient testing for efficiency and security is carried out, and the network is then kept segregated from the main network.
Ethical concerns
As computer usage has grown, cases of misuse of computer resources have also risen, and ADT is experiencing such threats, usually from insider staff and third parties. To deal with these threats and security issues, a comprehensive security framework is required that defines the disclosure of information and employees' interactions. This framework includes detailed password and security policies and details of acceptable use, conduct, and employee behavior.
The Chief Information Officer should ensure high-level security. Passwords and passphrases are used for verification and must therefore be changed at regular, short intervals. Employees should set strong passwords for their accounts; a password of at least seven alphanumeric characters is considered strong here.
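A small sketch of a check enforcing this stated policy follows; the rule encoded here is the report's own (seven characters mixing letters and digits), not a general security recommendation.

```python
import re

def is_strong(password: str) -> bool:
    """Apply the report's policy: 7+ characters with letters and digits."""
    return (len(password) >= 7
            and re.search(r"[A-Za-z]", password) is not None
            and re.search(r"[0-9]", password) is not None)

assert is_strong("adt2015x")
assert not is_strong("letters")   # rejected: no digits
```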
Organizational data is highly confidential and must not be leaked under any circumstances. Top management should therefore disable file sharing for organizational staff, or enable it only for specific staff under supervision. Management should also implement safety policies and agreements with the staff to secure confidential organizational data, warning employees that in any case of a breach of confidentiality, availability, or integrity they will be personally liable and subject to disciplinary action.