Yahoo Search — Web Search

Search results

  1. hadoop.apache.org › description — Apache Hadoop

    The Apache® Hadoop® project develops open-source software for reliable, scalable, distributed computing. The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of ...

  2. Mar 4, 2024 · A MapReduce job usually splits the input data set into independent chunks which are processed by the map tasks in a completely parallel manner. The framework sorts the outputs of the maps, which are then input to the reduce tasks. Typically both the input and the output of the job are stored in a file system.
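The flow this snippet describes — split the input, map each chunk independently, sort/group the map outputs, then reduce — can be sketched in plain Python (not Hadoop's native Java API; the function names here are illustrative, not part of any Hadoop interface):

```python
# Minimal single-process sketch of the MapReduce data flow,
# using word counting as the classic example job.
from collections import defaultdict

def map_phase(chunk):
    # A WordCount mapper emits one (word, 1) pair per word in its split.
    return [(word, 1) for word in chunk.split()]

def shuffle(mapped):
    # The framework sorts map outputs and groups all values by key
    # before handing them to the reducers.
    groups = defaultdict(list)
    for key, value in sorted(mapped):
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Each reducer aggregates the values for one key (here: a sum).
    return {key: sum(values) for key, values in groups.items()}

chunks = ["to be or", "not to be"]   # independent input splits
mapped = [pair for chunk in chunks for pair in map_phase(chunk)]
counts = reduce_phase(shuffle(mapped))
# counts == {'be': 2, 'not': 1, 'or': 1, 'to': 2}
```

In a real cluster the map calls run on different nodes, and the shuffle moves data between them; the per-key grouping is what lets each reduce task run independently as well.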

  3. Apache Hadoop is an open-source framework used to efficiently store and process large data sets ranging in size from gigabytes to petabytes. Instead of using one large computer to store and process the data, Hadoop allows clustering multiple computers to analyze massive data sets in parallel more quickly.

  4. www.coursera.org › articles › what-is-hadoop — What Is Hadoop? | Coursera

    Mar 19, 2024 · Apache Hadoop is an open-source platform that stores and processes large sets of data. Explore what Hadoop is and its role in big data processing, along with various use cases, the types of professionals who use it, and how you can begin learning ...

  5. Jun 18, 2023 · This document describes how to set up and configure a single-node Hadoop installation so that you can quickly perform simple operations using Hadoop MapReduce and the Hadoop Distributed File System (HDFS). Important: all production Hadoop clusters use Kerberos to authenticate callers and secure access to HDFS data, as well as restrict access ...
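The single-node setup that snippet refers to hinges on a small amount of configuration; a typical fragment is the `core-site.xml` entry that points the default filesystem at a local HDFS instance (the host and port below are the common pseudo-distributed defaults and may differ in your environment):

```xml
<!-- etc/hadoop/core-site.xml: direct clients at a local,
     single-node HDFS instead of the local filesystem. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

With this in place, filesystem paths without an explicit scheme (e.g. `hadoop fs -ls /`) resolve against the single-node HDFS rather than the local disk.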

  6. Apache Hadoop 3.2.1 incorporates a number of significant enhancements over the previous major release line (hadoop-3.2). This release is generally available (GA), meaning it represents a point of API stability and quality that we consider production-ready. Overview: users are encouraged to read the full set of release notes; this page provides an overview of the major changes.

  7. Apache Hadoop is an open-source software framework developed by Douglas Cutting, then at Yahoo, that provides highly reliable, distributed processing of large data sets using simple programming models. Hadoop overcame the scalability limitations of Nutch, and is built on clusters of commodity computers, providing a cost-effective solution ...
