
The Big Data concept as a contributor of added value to crisis decision support systems


Abstract

The explosive growth of the Big Data concept and its applications across multiple domains of human activity has increased interest in the benefits it could offer to public services. The paper seeks to emphasise the added value that stems from the use of Big Data in handling the heterogeneous data sources accessed by crisis management structures. The authors aim to highlight the ways in which the concept could support decision makers during the crisis management process by presenting two cases: first, the authors’ own creation, known as the Hybrid Decision Support System for Crisis/Disaster Management; second, an application of Big Data in security management that illustrates the implications of using Big Data in practice. Since Big Data is a vendor-driven term that often creates more confusion than clarity, the authors’ conclusion brings together the critical observations and judgements voiced in the paper using the SWOT analysis tool, providing a blueprint for further development of the Big Data concept in the area of crisis/disaster management.

Introduction

‘640 K ought to be enough for anybody’, announced Bill Gates in 1981 (Lai, Citation2008). Today, an average personal computer has some 4 GB of operating memory, which is more than six thousand times as much. This example very aptly illustrates the incredible amount of data that we tend to collect these days. Email, mobile device location tracking, social networking sites such as Facebook, LinkedIn, Flickr and Twitter, as well as the increasingly popular blogs and the proliferation of mobile devices – all these account for an exponential growth in the amount of data that can be processed and examined for useful information.

A number of examples are employed in the paper to demonstrate how Big Data can benefit institutions concerned with security and continuity in a given area. The authors have also given due attention to risks involved in gathering, processing and sharing data that may potentially contain sensitive information, whether private or corporate. It is easy to see that any information indicative of weaknesses of public security systems could be of value to dishonest individuals and businesses as well as to organised crime and terrorist groups.

The discussion in the paper builds on the case of the Hybrid Decision Support System for Crisis/Disaster Management introduced at previous IFIP Working Group 8.3 conferences (Stanek & Drosio, Citation2012; Stanek, Namyslo, & Drosio, Citation2013). By incorporating the use of Big Data to extend and enhance the original idea, the authors seek to emphasise the immense potential and relevance of the hybrid approach in such turbulent and volatile decision contexts as crisis situations and business discontinuities. To show how useful Big Data can be in the decision-making process connected with crisis management, a second case demonstrates the benefits that can be gained by using this technology on a daily basis.

The use of Big Data to support the emergency decision-making process is perceived in the paper as a starting point for the construction of models that envision relevant applications supporting crisis management structures in the performance of their various responsibilities.

Big Data as a subsequent stage in the evolution of the Data Mining concept

The originators of early decision support systems strove primarily, much like present-day designers do, to deliver the right information to the right person in the right format and at the right time and place (Stanek & Drosio, Citation2012). Characteristically, most of them would adhere to very general definitions of the decision-making environment. This discussion sets out from Klein and Methlie’s definition of a decision support system (DSS) as

a computer information system that provides information and methodological knowledge (domain knowledge and decision methodology knowledge) by means of analytical decision models and access to data bases and knowledge bases to support a decision-maker in making decisions in complex and ill-structured tasks

(Klein & Methlie, Citation1995). Under this definition, decision support systems are placed in a setting that makes it easier to link their original formulation to the Big Data concept – one that, at this point, represents the state of the art in data analysis for decision support purposes.

The emergence of computer systems capable of supporting the work of managers was preceded and accompanied by a proliferation of two classes of IT tools addressing the needs of business organisations (Stanek, Citation2007):

Office Automation Systems (OAS) geared to raise the efficiency of specific organisational structures by facilitating the production and circulation of documents and correspondence (letters, memos, studies, reports, presentations, etc.).

Management Information Systems (MIS) that were designed to support the processes of organisational planning and control and to help generate reports.

This evolution trend responded to the increasing reporting requirements that integrated management systems were expected to meet. Information systems built around a transactional data base were unable to satisfy all those needs and soon became obsolete, even though they remain in use in most organisations to this day.

The advances outlined above encouraged the idea of independent systems characterised by high efficiency in providing users with information that can assist them in making business decisions. Being aware of the amounts of data to be processed and the known limitations of transactional data bases in that respect, the originators of these new-generation systems utilised a new type of resource termed as a Data Warehouse (Stanek, Citation2007).

Decision Support Systems (DSS) – seen as a real breakthrough in business informatics. As the name suggests, such systems aim to support organisational decision-making processes.

Expert systems – based on the if–then implication: if a condition (or a set of conditions) holds, then stated actions (decisions) follow.

Importantly enough, both Data Warehouses and transactional data bases involve a sequential approach to data analysis, which, given the nature of the ETL (Extract, Transform and Load) tools used, hardly permits ad hoc analysis, hence compromising functionality. Yet, it was not until the technology came into popular use that the problem was properly diagnosed and a solution was proposed. As a result, a sub-discipline of computer science and statistics emerged that is now designated as Data Mining. The knowledge discovery process, an alternative name for Data Mining, consists in searching data for correlations that may be indicative of linkages and relationships, while the tools employed in performing such searches are mostly based on the Data Warehousing paradigm. This, however, was only an intermediate stage on the way to the paradigm known today as Big Data. The name denotes a new approach to data research that can be employed in the decision-making process and can be very useful to decision makers who suffer from a lack of structured information during crises and disasters. This violent and turbulent environment opens up new opportunities for Big Data: not only for applying the concept itself, but also as a direction for future R&D projects.
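The knowledge-discovery idea described above can be illustrated with a minimal sketch: scanning pairs of variables for strong correlations that may be indicative of a linkage worth investigating. The variable names and series below are invented for illustration, not drawn from any system discussed in the paper.

```python
# A minimal sketch of knowledge discovery (Data Mining): search pairs of
# variables for correlations that may signal an underlying relationship.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative series (invented values).
series = {
    "rainfall_mm":    [2, 14, 30, 45, 51, 60],
    "river_level_cm": [110, 118, 140, 171, 180, 195],
    "road_traffic":   [900, 870, 905, 880, 860, 890],
}

# Flag strongly correlated pairs as candidate relationships to investigate.
names = list(series)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r = pearson(series[names[i]], series[names[j]])
        if abs(r) > 0.8:
            print(f"{names[i]} ~ {names[j]}: r = {r:.2f}")
```

In this toy data-set only the rainfall/river-level pair exceeds the threshold, which is exactly the kind of candidate relationship a knowledge-discovery pass would hand to an analyst for interpretation.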

Data sources and their characteristics

When it comes to Big Data, it is difficult to delineate a closed data-set, particularly of unstructured data. To underscore the magnitude of this resource, Figure 1 indicates some of the potential sources of Big Data.

Figure 1. Examples of Big Data sources (Wieczorkowski, Citation2014).


Their characteristics are reflected in the so-called ‘5 Vs’. The first three Vs were defined even before the arrival of Big Data and continue to be seen as the original and most important ones (Tamhane & Sayyad, Citation2014):

Volume – that refers to the vast amounts of data generated every second. With 90% of the world’s data created in the last 2–3 years, the volume of data that is accumulated daily is what presents immediate challenges for businesses.

Velocity – that refers to the speed at which new data is generated and the speed at which it moves around. For example, the New York Stock Exchange captures about 1 terabyte of trade information daily. Reacting fast enough and analysing the streaming data is troublesome to businesses, with speeds and peak periods often inconsistent.

Variety – that refers to the different forms of data that we collect and use. Data come in different formats, e.g. they may be structured or unstructured. With the recent craze for social media, the percentage of unstructured data has gone up to above 80% of all data around the world.

The remaining two Vs originate in IBM’s Big Data projects, with subsequent contributions from recent research targeted on determining how much data are worth to an organisation:

Veracity – that refers to the uncertainty surrounding data, which is due to data inconsistency and incompleteness and which entails another challenge – that of keeping this huge amount of information organised.

Value – that refers to the effects of data mining and analytics used to expose valuable business information embedded in structured and unstructured data, data streams and data warehouses.

To see what these characteristics mean in the realm of crisis management, it seems advisable to bring up some basic examples of the types of data that are used by public services in pursuing their goal of advancing public safety and predicting crisis events:

(1)

Structured data examples: amount of precipitation; river levels; road traffic; number of local security incidents (e.g. fires, car crashes); wind speed and direction, data from injured individuals’ personal GPS receivers; traffic statistics from smart city management systems; forces and resources used to control a specific type of incident (e.g. the amount of extinguishing agents used to put out a fire in a two-story building).

(2)

Unstructured data examples: photos shared on local social networking sites; comments posted on local social networking sites before, during and after the incident; video recordings from public security surveillance (CCTV) systems; social media posts from residents of potential impact areas (e.g. notices of power failures, interruptions to Internet access, people’s irregular behaviours, or unusual phenomena).
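The distinction between the two groups of examples above is what an ingest layer in a crisis management system would have to operationalise. The sketch below is a hedged illustration of one naive routing rule, with all source and field names invented: purely numeric payloads are treated as structured, everything else as unstructured.

```python
# A naive sketch of routing incoming crisis-data records: numeric sensor
# payloads go to structured storage, free-text feeds to an unstructured store.
# Source names and field names are invented for illustration.

def route(record):
    """Return 'structured' for purely numeric payloads, else 'unstructured'."""
    payload = record["payload"]
    if all(isinstance(v, (int, float)) for v in payload.values()):
        return "structured"
    return "unstructured"

feed = [
    {"source": "river_gauge", "payload": {"level_cm": 182, "flow_m3s": 41.5}},
    {"source": "twitter", "payload": {"text": "Power is out across the old town"}},
    {"source": "wind_sensor", "payload": {"speed_ms": 17.2, "direction_deg": 240}},
]

buckets = {"structured": [], "unstructured": []}
for rec in feed:
    buckets[route(rec)].append(rec["source"])

print(buckets)
```

A real system would of course need a far richer classification (timestamps, geotags, media attachments), but the split itself is the precondition for the separate processing paths discussed later in the paper.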

The HACE theorem and Big Data within a decision making process

Looking at the Big Data definition derived from the HACE theorem (Wieczorkowski, Citation2014), one could inquire about how crisis or business continuity management structures can be enabled to cope with ‘large-volume, Heterogeneous, Autonomous sources with distributed and decentralised control, and […] Complex and Evolving relationships among data’ (Tamhane & Sayyad, Citation2014). Before this question is answered, one needs to contemplate how data sets labelled as Big Data are built and at which stage of an emergency decision-making process such data sets, given their characteristics, can be exploited.

‘It’s a rope!’ ‘No, no, it’s a hose!’ ‘It’s a wall!’ ‘It isn’t! Surely it’s a tree!’

These contentions illustrate how contradictory our determinations can be if we investigate isolated fragments of reality described by large data sets. Efforts to examine such fragmentary ‘snapshots’ found in large volumes of data characterised by the HACE criteria can be compared to four blind men’s attempts at identifying a huge elephant standing in front of them (cf. Figure 2) (Tamhane & Sayyad, Citation2014).

Figure 2. An example of how limited perceptions may affect our image of what is right in front of us (Tamhane & Sayyad, Citation2014).


This intelligible example makes it easier to understand the difficulty faced by analysts handling Big Data. Unless they adopt the right perspective and thoroughly investigate the relationships among specific areas, they will arrive at nothing but fragmentary insights and, more often than not, misleading conclusions.

There are further pitfalls in analysing vast data sets. First, the elephant keeps growing very fast. A glance at several statistics cited by Usama Fayyad at the KDD BigMine 12 Workshop will let anyone grasp the magnitude of these data sets (Tamhane & Sayyad, Citation2014):

Google receives more than 1 billion queries per day;

more than 250 million new tweets appear on Twitter per day;

there are more than 800 million updates to Facebook per day;

YouTube gets more than 250 billion views per day;

the volume of data that are generated these days is estimated in zettabytes, and grows by around 40% each year;

a new large source of data is associated with mobile devices and major ICT companies (Google, Apple, Yahoo).

Second, as the blind men examining different elements of the data-set access more and more sources, their confidence in their observations grows while the picture may actually become further blurred; errors are then transmitted among the blind men, just as they would be from one analyst to another, propagating erroneous perceptions of the nature of the object (Wu, Zhu, Wu, & Ding, Citation2014).

Properties of the decision making process in crisis management

Under Polish legislation, the crisis management process is defined as ‘activities conducted by public administration as part of national security management, aimed at preventing crisis events, preparing to contain them through planned action, responding to crises as they arise, and restoring or refurbishing any affected infrastructure’ (The Crisis Management Act [of the Parliament of Poland] of April 26, 2007, Citation2011). It should be noted that the definition makes crisis management appear as a mechanism that is only activated when an emergency threatens to strike the area governed by a given local government unit. If this definition were adopted for our analysis, then the Four Lights Platform would not make any sense, and it would be barely thinkable to attempt incorporating Big Data into crisis management processes. Considering this, the authors have chosen to go beyond the statutory definition of the crisis management process and, in this paper, rely on designations found in other laws and international regulations describing respective states and phases in the operation of a crisis and continuity management system based on the three-stage model prevalent in Europe and the US (Stanek & Drosio, Citation2012):

(1)

the pre-crisis stage (prevention and warning),

(2)

the crisis stage,

(3)

the post-crisis stage (recovery).

For the sake of argument and in the context of contentions made by advocates of Big Data, the subsequent chapters will try to demonstrate that knowledge derived from these data sets can be critical to the efficient performance of the prevention and early warning functions of crisis management systems, i.e. at stage one of the process. Once rich data feeds from diverse sources are available and there are efficient tools to examine them, even very limited analysis can produce useful diagnoses and hypotheses that could timely alert public safety stakeholders and decision makers to harbingers of imminent dangers and discontinuities.
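The prevention-and-warning function at stage one can be sketched very simply: compare the latest structured readings against historical norms and alert when they deviate. The thresholds, sensor values and the two-sigma rule below are illustrative assumptions, not a mechanism described in the paper.

```python
# A minimal early-warning sketch for the pre-crisis stage: flag a reading
# that deviates strongly from its recent history. Values are invented.
from statistics import mean, stdev

def alert(history, latest, k=2.0):
    """True if 'latest' lies more than k standard deviations from the mean."""
    mu, sigma = mean(history), stdev(history)
    return abs(latest - mu) > k * sigma

river_levels_cm = [112, 115, 110, 118, 114, 116, 113]  # recent daily readings
today = 168

if alert(river_levels_cm, today):
    print("early warning: river level anomalous, notify crisis staff")
```

Even such a limited analysis, run continuously over rich data feeds, is the kind of timely diagnosis-and-hypothesis generation the paragraph above argues for.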

Big Data applications in detecting threats

Two major groups of Big Data are distinguished (Tamhane & Sayyad, Citation2014):

structured data – denoting numbers and words that can be easily categorised and analysed. Such data are generated e.g. by network sensors embedded in electronic appliances, smartphones or Global Positioning System (GPS) devices. Structured data may also include sales figures, account balances, transaction data, etc.

This type of data is commonly employed by Business Intelligence systems as well as by crisis management bodies. The latter usually obtain such data e.g. from a variety of transactional systems running at offices of government administration, from local/regional public services, and from progress/completion reports on crisis response plans and routines. Under the Four Lights Platform concept presented in this paper, structured data are at the heart of a data warehouse – the centrepiece of the ‘Green Light’ module described in Table 1. Clearly, it is this group of data that drives the development and use of most Business Intelligence systems in crisis management.

Table 1. An overview of the Four Lights Platform components (Stanek & Drosio, Citation2012).

A new trend in crisis and discontinuity prediction is associated with attempts to analyse the other type of data, namely:

unstructured data – including information that is more complex than that contained in structured data, such as: customer reviews from commercial websites, photos and other multimedia content, or comments posted on social networking sites. These data cannot be easily separated into categories or analysed in numerical terms.

‘Unstructured big data is the things that humans are saying,’ says Tony Jewitt, Vice President of a consulting firm from Texas, USA (Thakur & Mann, Citation2014). What differentiates this group of data from its structured counterpart is that the language used to describe the specific elements of a data-set is hardly formalised. Obviously, this means that difficulties will arise where the data base contains such data and a machine is charged with the task of querying the data analysis system. Therefore, while there is no doubt that unstructured data constitute an immense wealth of data that could be used in crisis prediction, they pose a challenge that the authors’ conceptualisation of a decision support system for crisis management is yet to address. Further in this paper, however, the authors present a few relevant examples of predictive models founded on unstructured Big Data.
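One elementary way of pulling crisis signals out of ‘the things that humans are saying’ is keyword matching against incident categories; real systems would use proper natural-language processing, and the vocabulary below is entirely invented, but the sketch shows why the lack of formalised language makes machine querying hard (a comma or synonym already defeats naive matching).

```python
# A naive sketch of extracting crisis signals from unstructured text:
# match a post's words against keyword sets per incident category.
# The categories and keywords are invented for illustration.

INCIDENT_KEYWORDS = {
    "fire": {"fire", "smoke", "burning"},
    "flood": {"flood", "overflow", "flooded"},
    "outage": {"power", "blackout", "outage"},
}

def classify(post):
    """Return the incident categories whose keywords appear in the post."""
    words = set(post.lower().split())
    return {cat for cat, kws in INCIDENT_KEYWORDS.items() if words & kws}

posts = [
    "Thick smoke over the market square, something is burning",
    "Power outage in the whole district since noon",
]
for p in posts:
    print(classify(p) or {"unclassified"})
```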

Case 1: The role of Big Data within the conception of the hybrid decision support system for crisis/disaster management

The approach summarised above inspired this paper’s authors to look again at their conception of the Hybrid Decision Support System for Crisis/Disaster Management (HDSSCM), which they put forth nearly four years ago (Stanek & Drosio, Citation2012), and delineate a Big Data application area that would be not only interesting scientifically but also very promising in terms of predictive and analytical potential. The reasons why the Big Data concept is regarded as a support tool solely in the context of crisis and continuity management will be enlarged upon in a subsequent part of the paper.

It should be considered at this point how the concept could be exploited to support decisions in this specific area (i.e. crisis and business continuity management). Table 1 overviews the characteristics of the respective HDSSCM component modules along with the areas to which they deliver support, and the supporting technologies that could be deployed in each area.

The concept was supposed, in the first place: (1) to integrate heterogeneous data by developing an appropriate domain ontology; (2) to pre-process data for access by both front-end (i.e. residents in a given administrative unit) and back-end (i.e. organs of public administration and members of the business community participating in a specific crisis management system) users; (3) to perform and update analyses, drawing on an array of technology innovations and novel methodologies (e.g. fuzzy logic, software agents, semantic modelling); (4) to support decision makers by delivering the right data at the right time and place, thus improving the quality of decisions (Stanek et al., Citation2013).

To further enhance the quality of decisions while at the same time reducing the decision-making lag, Big Data will be added on top of all these concepts and technologies – an inevitable step in view of what has already been said in the initial chapter of this paper. This addition is in fact inscribed in the design of the ‘Green Light’ platform, whose support function is realised primarily by devising a wide range of uses for very large data sets coming from diverse sources, such that inference processing of these data can effectively contribute to improving safety in a given area.
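Point (1) of the concept, integrating heterogeneous data through a domain ontology, can be hinted at with a very reduced sketch: mapping source-specific field names onto a shared domain vocabulary. A dictionary of renamings is of course only a stand-in for a full ontology, and every mapping below is an invented example.

```python
# A reduced sketch of ontology-based integration: rename source-specific
# fields to a shared domain vocabulary so downstream modules see one schema.
# All source names and mappings are invented for illustration.

ONTOLOGY_MAP = {
    "river_gauge":  {"lvl": "water_level_cm", "ts": "timestamp"},
    "weather_feed": {"wspd": "wind_speed_ms", "time": "timestamp"},
}

def to_domain(source, record):
    """Rename a record's keys according to the source's mapping."""
    mapping = ONTOLOGY_MAP[source]
    return {mapping.get(k, k): v for k, v in record.items()}

unified = [
    to_domain("river_gauge", {"lvl": 182, "ts": "2016-05-01T12:00"}),
    to_domain("weather_feed", {"wspd": 17.2, "time": "2016-05-01T12:00"}),
]
print(unified)
```

The point of the design is that front-end and back-end users, point (2) above, can then query one vocabulary regardless of which service produced the data.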

Case 2: Big Data in New York City management – the power of structured data

In 1990, as New York City was facing the problem of bringing down the sky-rocketing crime rate, its leaders began by upgrading the existing computerised decision support tools to a system providing online access to a rich variety of records and statistics, which led to a substantial reduction in law enforcement response times. William J. Bratton, who took over as commissioner of the New York City Police Department (NYPD), had an ambitious goal of re-orienting the police force toward proactive crime prevention rather than merely responding to breaches of the law as they occur (Glisinan & Stephan, Citation2014).

The first step was to launch and orchestrate Compstat across departments and precincts – a system that would make use of historical data to help align the allocation and deployment of resources to crime incidence and thus reduce response times. As a result, New York Police Department operations were supposed to conform to Bratton’s motto: ‘Your arrests should be where the problems are’ (Glisinan & Stephan, Citation2014).

Between 1994 and 2010 Compstat, which was originally developed as a simple asset management system, evolved into a tactical-level decision support system (Glisinan & Stephan, Citation2014) that, in addition, aided governmental investment and development decisions.

The following case can illustrate ways in which Big Data, if collected and analysed properly, can be exploited in day-to-day operations:

(1)

Problem definition: poor fire prevention and large distances between where firemen were stationed and where fires would occur most often.

(2)

Problem solution:

o

The Fire Department’s reports suggested that the fire prevention problem was most severe in quarters where the dominant type of architecture was the old tenement house. Using a geographical information system and historical data on the Fire Department’s activities, maps were developed showing the spots with the highest fire hazard.

o

Since many of the areas with the highest risk of fire were inhabited by individuals with an ambivalent attitude toward public officers and toward accepted standards of community life, the Department also examined tax information for overdue tax and traffic ticket payments and looked into statistics on law enforcement incidents in a given area (an example of the visualisation is given in Figure 3).

o

To further refine and extend the findings, data were also sourced from the power company to help identify locations where demand for electricity tended to rise rapidly on weekends and in the afternoon.

(3)

Contribution of Big Data analysis: improved performance of such services as the police, the fire department, and building inspectors, resulting in 40% fewer fires in 2012–13 (Rożek, Citation2015).

Figure 3. A map of New York including crime statistics. Available at http://maps.nyc.gov/crime/.

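The scoring idea behind the case can be sketched as combining indicators from the several municipal data sources listed above into a single per-building risk figure. The weights, field names and records below are purely invented for illustration; the actual New York model is not published in this form.

```python
# A speculative sketch of multi-source fire-risk scoring: each indicator
# (building age, tax arrears, nearby incidents, power-demand spikes)
# contributes to one score. All weights and fields are invented.

def fire_risk(building):
    score = 0.0
    if building["built_before_1940"]:          # old tenement architecture
        score += 0.4
    if building["tax_arrears"]:                # overdue tax / ticket payments
        score += 0.2
    score += min(building["incidents_nearby"], 10) * 0.02  # local incident stats
    if building["weekend_power_spikes"]:       # utility-demand anomaly
        score += 0.2
    return round(score, 2)

buildings = [
    {"id": "A", "built_before_1940": True, "tax_arrears": True,
     "incidents_nearby": 7, "weekend_power_spikes": True},
    {"id": "B", "built_before_1940": False, "tax_arrears": False,
     "incidents_nearby": 1, "weekend_power_spikes": False},
]

# Rank buildings so inspectors visit the riskiest first.
for b in sorted(buildings, key=fire_risk, reverse=True):
    print(b["id"], fire_risk(b))
```

The value of the approach lies less in any single indicator than in the ranking it yields: inspection capacity is directed to where several independent data sources agree.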

Use of Big Data: strengths and weaknesses; opportunities and threats

It must be borne in mind, however, that there are multiple challenges in handling continually expanding data collections, some of which include so-called data reservoirs (Nair, Citation2011); these challenges should therefore be associated with an array of risks and threats that are likely to materialise. To explore the issue in more depth, the authors have performed a SWOT analysis – one of the analytical tools most frequently used in the ICT industry. Table 2 brings together the key internal and external factors for and against wide use of Big Data as a support tool for decision-making within crisis management processes.

Table 2. A SWOT analysis of Big Data usage in the area of crisis management.

The overall picture of Big Data that seems to emerge from this discussion is confusingly ambiguous. Admittedly, the use of Big Data by offices and institutions of government is where the picture is least clear, as governments are, on the one hand, vested with responsibility for public safety and, on the other, commonly suspected of a penchant for mass surveillance. Evidence for the latter can be found e.g. in the case of former National Security Agency contractor Edward Snowden. It was not until he published his revelations that the public heard about US government agencies using such tools as XKeyscore, Tempora, Bullrun or Edgehill to track their ‘targets’ almost in real time. What these tools track and examine is, beyond any doubt, exactly the sort of content comprised by the term Big Data: emails, conversations held via Internet messaging and video chat services, people’s activities on social networking sites, their web search queries, etc. The Snowden leaks also showed how ineffective electronic encryption is in protecting private information from access by security services (Zmudzinski, Citation2014).

Nonetheless, generalisations and suggestions that Big Data are dangerous on their own would not be legitimate, and it probably would not be wise to insist on withdrawal from their broad use. Analyses controlled by appropriate algorithms, based primarily on historical data collected by public services themselves and making only limited use of unstructured data, should be seen as an opportunity to enhance safety in a given area. Used in the way shown in Case 2 to support New York’s police and fire departments, Big Data did bring the city tangible benefits and added value, visible in the declining number of incidents endangering people’s lives.

Meanwhile, statistics produced by IDC indicate that as little as 0.5% of all the 40 zettabytes of data to be collected by mankind by 2020 is subject to analysis (The Crisis Management Act [of the Parliament of Poland] of April 26, 2007, Citation2011). The forecast sounds fairly optimistic in that it reveals minimal interest in exploiting people’s personal data. If this were true, there would be no good reason to fear ubiquitous surveillance or Big Data giving birth to a Big Brother.

Conclusions

By way of conclusion, some thought should be given to what method of data exploration and what analytical framework would be best in terms of minimising the risks associated with collecting and processing HACE-compliant information. The example of a Big Data processing framework shown in Figure 4 is composed of three tiers corresponding, respectively, to data access and computation (Tier I), data privacy and domain knowledge (Tier II), and Big Data mining algorithms (Tier III) (Wieczorkowski, Citation2014). The concept has received prior treatment from a number of authors (Wieczorkowski, Citation2014; Tamhane & Sayyad, Citation2014; Verma & Dey, Citation2015). This approach to Big Data analysis demonstrates that, by striking a balance between the urge to collect data, privacy policy constraints and the capabilities of search engines, one should be able to acquire enormous information resources whose relevance to the decision support area, including support for decision-making in crisis management, will be unquestionable.

Figure 4. A Big Data processing framework: a three-tier structure centred on the Big Data Mining Platform.

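The three-tier flow can be hinted at in code: Tier I loads the raw records, Tier II enforces privacy constraints before anything is mined, and Tier III runs an analysis on the cleaned data. The anonymisation rule, field names and records below are invented assumptions used only to make the layering concrete.

```python
# A reduced sketch of the three-tier framework: access (I), privacy and
# domain knowledge (II), mining (III). All records and rules are invented.

def tier1_access(raw_records):
    """Tier I: data access and computation - load records for processing."""
    return list(raw_records)

def tier2_privacy(records):
    """Tier II: strip personally identifying fields before mining."""
    return [{k: v for k, v in r.items() if k not in {"name", "phone"}}
            for r in records]

def tier3_mine(records):
    """Tier III: a trivial mining step - count incident reports per district."""
    counts = {}
    for r in records:
        counts[r["district"]] = counts.get(r["district"], 0) + 1
    return counts

raw = [
    {"name": "J. K.", "phone": "555-0101", "district": "north", "event": "flood"},
    {"name": "A. N.", "phone": "555-0102", "district": "north", "event": "flood"},
    {"name": "B. W.", "phone": "555-0103", "district": "south", "event": "fire"},
]

print(tier3_mine(tier2_privacy(tier1_access(raw))))
```

The essential design point is the ordering: the mining tier only ever sees what the privacy tier has let through, which is precisely the balance between data collection and privacy constraints argued for above.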

The fireman who, back in 1241, warned the people of Krakow against the imminent Tatar threat by blowing his trumpet from the tower of St. Mary’s Church (CitationThe Hejnał) has surprisingly much in common with twenty-first century data analysts who are busy searching through hoards of information on social networking sites to get at the latest news from crisis-stricken areas and find out about their residents’ reactions. Throughout the ages, those in charge of people’s safety have been similarly concerned with ways of fending off disasters and mitigating their impact. Social and technological progress should, however, be backed up by responsible and proactive attitudes of decision makers and crisis response personnel toward their mission; something that is the most difficult to make happen, but something we have been amazed to see recently in New York officials and experts (Glisinan & Stephan, Citation2014).

Crisis management should be viewed not so much as a set of techniques and procedures to be applied in case of a specific emergency but, instead, as a complex of activities, involving all available powers and capacities, that are focused on predicting and preventing dangers from materialising (Bertrand & Chris, Citation2002). In light of the above discussion, Big Data appears as one of those technologies that, owing to the likely but not yet thoroughly researched potential inherent in advanced data analytics, could trigger a major change in approaches to deploying computer support for crisis management.

Disclosure statement

No potential conflict of interest was reported by the authors.

References

  • Bertrand, R., & Chris, L. (2002). A new approach to crisis management. Journal of Contingencies and Crisis Management, 10, 181–191.
  • Glisinan, K., & Stephan, A. (2014). From Compstat to Gov 2.0: Big Data in New York City management. New York, NY: School of International and Public Affairs, Case Consortium, Columbia University.
  • Klein, M., & Methlie, L. (1995). Knowledge-based decision support systems with applications in business (p. 112). Chichester: Wiley.
  • Lai, E. (2008, June 23). The ‘640K’ quote won’t go away -- but did Gates really say it? Computerworld. Retrieved from http://www.computerworld.com/article/2534312/operating-systems/the--640k--quote-won-t-go-away----but-did-gates-really-say-it-.html
  • Nair, M. (2011). Big data reservoirs: Getting from big data to valuable data [Blog]. Retrieved from https://blogs.oracle.com/dataintegration/entry/big_data_reservoirs_getting_from
  • Rożek, T. (2015). Window to the future. Gość Niedzielny, 48. Katowice: Wydawnictwo Kurii Metropolitalnej.
  • Stanek, S. (2007). The development of management information systems engineering. Katowice: Academy of Economics in Katowice.
  • Stanek, S., & Drosio, S. (2012). A hybrid decision support system for disaster/crisis management. In A. Respicio et al. (Eds.), Fusing decision support into the fabric of the context (pp. 279–290). Amsterdam: IOS Press.
  • Stanek, S., Namyslo, J., & Drosio, S. (2013). Developing the functionality of a mobile decision support system. Journal of Decision Systems: Special Issue on Mobile Decision Support Systems: Addressing Challenges of Real-Time Decision-Making, 22, 53–68.
  • Tamhane, D., & Sayyad, S. (2014). Big data analysis using HACE theorem. International Journal of Advanced Research in Computer Engineering & Technology (IJARCET), 4, 18–23.
  • Thakur, B., & Mann, M. (2014). Data mining for big data: A review. International Journal of Advanced Research in Computer Science and Software Engineering, 4, 469–473.
  • The Crisis Management Act [of the Parliament of Poland] of April 26, 2007. (2011). Journal of Laws 2011, No. 22, Item 114. Warsaw: The Government Legislation Centre.
  • The Hejnał. Retrieved from http://www.inyourpocket.com/krakow/The-Hejnal_3755f
  • Verma, G., & Dey, G. (2015). Big data: A concept of managing huge data. International Journal of Computer Applications, Applications of Computers and Electronic for the Welfare of Rural Masses, 2015(1), 29–32.
  • Wieczorkowski, J. (2014). The use of big data concept in public administration. In Collegium of Economic Analysis Annals, issue 33 (pp. 567–579). Warsaw: Warsaw School of Economics.
  • Wu, X., Zhu, X., Wu, G.-Q., & Ding, W. (2014). Data mining with big data. IEEE Transactions on Knowledge and Data Engineering, 26, 97–107.
