Package | Description |
---|---|
org.apache.giraph.benchmark | Package of benchmarks for performance testing and optimization. |
org.apache.giraph.block_app.framework.api | Interfaces representing the full API to the underlying graph processing system. |
org.apache.giraph.block_app.framework.piece.global_comm.internal | Internal implementation of Reducer and Broadcast handles for automatic handling of global communication within Pieces, hiding many of its complexities. |
org.apache.giraph.block_app.library | Core library of Pieces and Suppliers, providing the most common usages. |
org.apache.giraph.block_app.migration | Migration utility for transforming standard Giraph applications into Block Applications. |
org.apache.giraph.block_app.reducers | Common reducer utilities for Block Applications. |
org.apache.giraph.block_app.reducers.array | Reducers for collecting arrays of objects. |
org.apache.giraph.block_app.reducers.collect | Reducers for distributed collection of objects. |
org.apache.giraph.block_app.reducers.map | Reducers for collecting maps of objects. |
org.apache.giraph.comm.aggregators | Package for classes which are used to handle aggregators. |
org.apache.giraph.master | Package of all the master-related things. |
org.apache.giraph.reducers | Package of Giraph reducers. |
org.apache.giraph.reducers.impl | Package of Giraph reducer implementations. |
Modifier and Type | Class and Description |
---|---|
static class | ReducersBenchmark.TestLongSumReducer: LongSumReducer |
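
ReducersBenchmark.TestLongSumReducer is simply the built-in LongSumReducer exposed for benchmarking. For reference, a minimal long-sum reduce operation might look like the following sketch; it assumes the usual createInitialValue/reduce/reduceMerge contract of ReduceOperation and that the interface extends Writable.

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.giraph.reducers.ReduceOperation;
import org.apache.hadoop.io.LongWritable;

// Hedged sketch of a long-sum reducer, assuming the standard
// createInitialValue/reduce/reduceMerge contract of ReduceOperation.
public class LongSumReduceOp implements ReduceOperation<LongWritable, LongWritable> {
  @Override
  public LongWritable createInitialValue() {
    // Neutral element of the sum.
    return new LongWritable(0);
  }

  @Override
  public LongWritable reduce(LongWritable curValue, LongWritable valueToReduce) {
    // Fold a single contributed value into the running total.
    curValue.set(curValue.get() + valueToReduce.get());
    return curValue;
  }

  @Override
  public LongWritable reduceMerge(LongWritable curValue, LongWritable valueToReduce) {
    // Partial totals merge the same way single values do.
    return reduce(curValue, valueToReduce);
  }

  // ReduceOperation is Writable so it can be shipped to workers;
  // this reducer carries no configuration to serialize.
  @Override
  public void write(DataOutput out) throws IOException { }

  @Override
  public void readFields(DataInput in) throws IOException { }
}
```
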
Modifier and Type | Method and Description |
---|---|
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApi.createGlobalReducer(ReduceOperation<S,R> reduceOp) Create global reducer, returning a handle to it. |
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApi.createGlobalReducer(ReduceOperation<S,R> reduceOp, R globalInitialValue) Create global reducer, returning a handle to it. |
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApi.createLocalReducer(ReduceOperation<S,R> reduceOp) Create local reducer, returning a handle to it. |
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApi.createLocalReducer(ReduceOperation<S,R> reduceOp, R globalInitialValue) Create local reducer, returning a handle to it. |
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApi.CreateReducerFunctionApi.createReducer(ReduceOperation<S,R> reduceOp) |
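
In the Block Applications framework these factory methods are typically called from a Piece's reducer-registration hook. The fragment below is a hedged sketch of that pattern; the registration method name, the Object execution-stage type, and the availability of the built-in SumReduce.LONG operation are assumptions, not copied from a concrete Piece.

```java
// Hedged sketch: creating and keeping a global reducer handle inside a Piece.
private ReducerHandle<LongWritable, LongWritable> edgeCount;

@Override
public void registerReducers(CreateReducersApi reduceApi, Object executionStage) {
  // SumReduce.LONG is the built-in long-sum ReduceOperation.
  edgeCount = reduceApi.createGlobalReducer(SumReduce.LONG);
}
```

Worker-side code then contributes values through the handle (e.g. edgeCount.reduce(...)), and the master reads the aggregated result back through the same handle.
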
Modifier and Type | Field and Description |
---|---|
protected ReduceOperation<S,R> | ReducersForPieceHandler.ReduceHandleImpl.reduceOp |

Modifier and Type | Method and Description |
---|---|
<S,R extends org.apache.hadoop.io.Writable> | ReducersForPieceHandler.createGlobalReducer(MasterGlobalCommUsage master, ReduceOperation<S,R> reduceOp, R globalInitialValue) |
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApiWrapper.createGlobalReducer(ReduceOperation<S,R> reduceOp) |
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApiWrapper.createGlobalReducer(ReduceOperation<S,R> reduceOp, R globalInitialValue) |
<S,R extends org.apache.hadoop.io.Writable> | ReducersForPieceHandler.createLocalReducer(MasterGlobalCommUsage master, ReduceOperation<S,R> reduceOp, R globalInitialValue) |
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApiWrapper.createLocalReducer(ReduceOperation<S,R> reduceOp) |
<S,R extends org.apache.hadoop.io.Writable> | CreateReducersApiWrapper.createLocalReducer(ReduceOperation<S,R> reduceOp, R globalInitialValue) |

Constructor and Description |
---|
GlobalReduceHandle(ReduceOperation<S,R> reduceOp) |
LocalReduceHandle(ReduceOperation<S,R> reduceOp) |
WrappedReducedValue(ReduceOperation<?,R> reduceOp, R value) |
Modifier and Type | Method and Description |
---|---|
<S,R extends org.apache.hadoop.io.Writable> | SendMessageChain.endReduce(String name, ReduceOperation<S,R> reduceOp, FunctionWithVertex<I,V,E,P,S> valueSupplier, Consumer<R> reducedValueConsumer) End the chain by passing received messages to valueSupplier to produce the value to be reduced, which is then consumed on the master by reducedValueConsumer. |
<S,R extends org.apache.hadoop.io.Writable> | SendMessageChain.endReduceWithMaster(String name, ReduceOperation<S,R> reduceOp, FunctionWithVertex<I,V,E,P,S> valueSupplier, PairConsumer<R,BlockMasterApi> reducedValueConsumer) End the chain by passing received messages to valueSupplier to produce the value to be reduced, which is then consumed on the master by reducedValueConsumer. |
static <S,R extends org.apache.hadoop.io.Writable,I extends org.apache.hadoop.io.WritableComparable,V extends org.apache.hadoop.io.Writable,E extends org.apache.hadoop.io.Writable> | Pieces.reduce(String name, ReduceOperation<S,R> reduceOp, SupplierFromVertex<I,V,E,S> valueSupplier, Consumer<R> reducedValueConsumer) Creates a single-reducer piece: given a reduce operation and a supplier of values on the workers, it reduces them and passes the result to the given consumer on the master. |
static <S,R extends org.apache.hadoop.io.Writable,I extends org.apache.hadoop.io.WritableComparable,V extends org.apache.hadoop.io.Writable,E extends org.apache.hadoop.io.Writable> | Pieces.reduceAndBroadcast(String name, ReduceOperation<S,R> reduceOp, SupplierFromVertex<I,V,E,S> valueSupplier, ConsumerWithVertex<I,V,E,R> reducedValueConsumer) Creates a single reduce-and-broadcast piece: given a reduce operation and a supplier of values on the workers, it reduces them, broadcasts the result, and passes it to the consumer on the workers for each vertex. |
static <S,R extends org.apache.hadoop.io.Writable,I extends org.apache.hadoop.io.WritableComparable,V extends org.apache.hadoop.io.Writable,E extends org.apache.hadoop.io.Writable> | Pieces.reduceWithMaster(String name, ReduceOperation<S,R> reduceOp, SupplierFromVertex<I,V,E,S> valueSupplier, PairConsumer<R,BlockMasterApi> reducedValueConsumer) Creates a single-reducer piece: given a reduce operation and a supplier of values on the workers, it reduces them and passes the result to the given consumer on the master. |
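
As an illustration of Pieces.reduce, the sketch below sums every vertex's out-degree on the workers and hands the global total to a consumer on the master. It assumes SupplierFromVertex and Consumer are single-method interfaces usable as lambdas, that SumReduce.LONG is available, and that the method returns a composable Block; explicit type arguments may be needed in practice to help inference.

```java
// Hedged sketch: a single-reducer piece counting total out-degree.
Block sumDegrees = Pieces.reduce(
    "sumDegrees",                                      // reducer name
    SumReduce.LONG,                                    // how values are combined
    vertex -> new LongWritable(vertex.getNumEdges()),  // value supplied per vertex
    total -> System.out.println("Total degree: " + total.get()));  // consumed on master
```
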
Modifier and Type | Method and Description |
---|---|
static <S,R extends org.apache.hadoop.io.Writable,I extends org.apache.hadoop.io.WritableComparable,V extends org.apache.hadoop.io.Writable,E extends org.apache.hadoop.io.Writable> | Pieces.reduceAndBroadcastWithArrayOfHandles(String name, int numHandles, Supplier<ReduceOperation<S,R>> reduceOp, SupplierFromVertex<I,V,E,Long> handleHashSupplier, SupplierFromVertex<I,V,E,S> valueSupplier, ConsumerWithVertex<I,V,E,R> reducedValueConsumer) Like reduceAndBroadcast, but uses an array of handles for reducers and broadcasts, to keep it feasible and performant when values are large. |
Modifier and Type | Method and Description |
---|---|
<S,R extends org.apache.hadoop.io.Writable> | MigrationMasterCompute.registerReducer(String name, ReduceOperation<S,R> reduceOp) |
<S,R extends org.apache.hadoop.io.Writable> | MigrationMasterCompute.registerReducer(String name, ReduceOperation<S,R> reduceOp, R globalInitialValue) |
Modifier and Type | Class and Description |
---|---|
class | TopNReduce<S extends Comparable<S>>: Extracts top N largest elements. |
Modifier and Type | Class and Description |
---|---|
class | ArrayReduce<S,R extends org.apache.hadoop.io.Writable>: One reducer representing reduction of an array of individual values. |
class | BasicArrayReduce<S,R extends org.apache.hadoop.io.Writable>: Efficient generic primitive array reduce operation. |
Modifier and Type | Method and Description |
---|---|
static <S,R extends org.apache.hadoop.io.Writable> | BasicArrayReduce.createArrayHandles(int fixedSize, PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp, CreateReducersApi.CreateReducerFunctionApi createFunction) Registers one new reducer that reduces a BasicArray of predefined size by reducing individual elements using elementReduceOp. |
static <S,T extends org.apache.hadoop.io.Writable> | ArrayReduce.createArrayHandles(int fixedSize, ReduceOperation<S,T> elementReduceOp, CreateReducersApi.CreateReducerFunctionApi createFunction) Registers one new reducer that reduces an array of objects by reducing individual elements using elementReduceOp. |
static <S,R extends org.apache.hadoop.io.Writable> | BasicArrayReduce.createArrayHandles(PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp, CreateReducersApi.CreateReducerFunctionApi createFunction) Registers one new reducer that reduces a BasicArray of unbounded size by reducing individual elements using elementReduceOp. |
static <S,R extends org.apache.hadoop.io.Writable> | HugeArrayUtils.createGlobalReducerArrayHandle(int fixedSize, ReduceOperation<S,R> elementReduceOp, CreateReducersApi reduceApi) Creates a global array of reducers by splitting the huge array into NUM_STRIPES parts. |
static <S,R extends org.apache.hadoop.io.Writable> | HugeArrayUtils.createGlobalReducerArrayHandle(int fixedSize, ReduceOperation<S,R> elementReduceOp, CreateReducersApi reduceApi, int maxNumStripes) Creates a global array of reducers by splitting the huge array into maxNumStripes parts. |
static <S,R extends org.apache.hadoop.io.Writable> | BasicArrayReduce.createLocalArrayHandles(int fixedSize, PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp, CreateReducersApi reduceApi) Registers one new local reducer that reduces a BasicArray of predefined size by reducing individual elements using elementReduceOp. |
static <S,R extends org.apache.hadoop.io.Writable> | BasicArrayReduce.createLocalArrayHandles(PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp, CreateReducersApi reduceApi) Registers one new local reducer that reduces a BasicArray of unbounded size by reducing individual elements using elementReduceOp. |
Constructor and Description |
---|
ArrayReduce(int fixedSize, ReduceOperation<S,R> elementReduceOp) Create ReduceOperation that reduces arrays by reducing individual elements. |
BasicArrayReduce(int fixedSize, PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp) Create ReduceOperation that reduces BasicArrays by reducing individual elements, with predefined size. |
BasicArrayReduce(PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp) Create ReduceOperation that reduces BasicArrays by reducing individual elements, with unbounded size. |
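
As a small illustration of the ArrayReduce constructor, an element-wise long sum over fixed-size arrays can be assembled from an existing element operation (a sketch; the availability of the built-in SumReduce.LONG is assumed):

```java
// Hedged sketch: a reducer over 16-element arrays that sums each
// position independently, using SumReduce.LONG as the per-element operation.
ArrayReduce<LongWritable, LongWritable> arraySum =
    new ArrayReduce<>(16, SumReduce.LONG);
```
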
Modifier and Type | Class and Description |
---|---|
class | CollectPrimitiveReduceOperation<S>: Collect primitive values reduce operation. |
class | CollectReduceOperation<S>: Collect values reduce operation. |
class | CollectTuplesOfPrimitivesReduceOperation: Collect tuples of primitive values reduce operation. |

Modifier and Type | Method and Description |
---|---|
ReduceOperation<S,KryoWritableWrapper<List<S>>> | CollectShardedReducerHandle.createReduceOperation() |
ReduceOperation<List<Object>,KryoWritableWrapper<List<WArrayList>>> | CollectShardedTuplesOfPrimitivesReducerHandle.createReduceOperation() |
ReduceOperation<S,KryoWritableWrapper<WArrayList<S>>> | CollectShardedPrimitiveReducerHandle.createReduceOperation() |
abstract ReduceOperation<S,KryoWritableWrapper<R>> | ShardedReducerHandle.createReduceOperation() |
Modifier and Type | Class and Description |
---|---|
class | BasicMapReduce<K extends org.apache.hadoop.io.WritableComparable,S,R extends org.apache.hadoop.io.Writable>: Efficient generic primitive map-of-values reduce operation. |

Modifier and Type | Method and Description |
---|---|
static <K extends org.apache.hadoop.io.WritableComparable,S,R extends org.apache.hadoop.io.Writable> | BasicMapReduce.createLocalMapHandles(PrimitiveIdTypeOps<K> keyTypeOps, PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp, CreateReducersApi reduceApi) Registers one new local reducer that reduces a BasicMap by reducing the individual elements corresponding to the same key using elementReduceOp. |
static <K extends org.apache.hadoop.io.WritableComparable,S,R extends org.apache.hadoop.io.Writable> | BasicMapReduce.createMapHandles(PrimitiveIdTypeOps<K> keyTypeOps, PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp, CreateReducersApi.CreateReducerFunctionApi createFunction) Registers one new reducer that reduces a BasicMap by reducing the individual elements corresponding to the same key using elementReduceOp. |

Constructor and Description |
---|
BasicMapReduce(PrimitiveIdTypeOps<K> keyTypeOps, PrimitiveTypeOps<R> typeOps, ReduceOperation<S,R> elementReduceOp) Create ReduceOperation that reduces BasicMaps by reducing the individual elements corresponding to the same key. |
Modifier and Type | Method and Description |
---|---|
void | OwnerAggregatorServerData.registerReducer(String name, ReduceOperation<Object,org.apache.hadoop.io.Writable> reduceOp) Register a reducer which the current worker owns. |
Modifier and Type | Class and Description |
---|---|
class | AggregatorReduceOperation<A extends org.apache.hadoop.io.Writable>: Translates aggregation operations to reduce operations. |
Modifier and Type | Method and Description |
---|---|
<S,R extends org.apache.hadoop.io.Writable> | MasterAggregatorHandler.registerReducer(String name, ReduceOperation<S,R> reduceOp) |
<S,R extends org.apache.hadoop.io.Writable> | MasterGlobalCommHandler.registerReducer(String name, ReduceOperation<S,R> reduceOp) |
<S,R extends org.apache.hadoop.io.Writable> | MasterGlobalCommUsageAggregators.registerReducer(String name, ReduceOperation<S,R> reduceOp) Register a reducer, to be reduced in the next worker computation, using the given name and operation. |
<S,R extends org.apache.hadoop.io.Writable> | MasterCompute.registerReducer(String name, ReduceOperation<S,R> reduceOp) |
<S,R extends org.apache.hadoop.io.Writable> | MasterAggregatorHandler.registerReducer(String name, ReduceOperation<S,R> reduceOp, R globalInitialValue) |
<S,R extends org.apache.hadoop.io.Writable> | MasterGlobalCommHandler.registerReducer(String name, ReduceOperation<S,R> reduceOp, R globalInitialValue) |
<S,R extends org.apache.hadoop.io.Writable> | MasterGlobalCommUsageAggregators.registerReducer(String name, ReduceOperation<S,R> reduceOp, R globalInitialValue) Register a reducer, to be reduced in the next worker computation, using the given name and operation, starting globally from globalInitialValue. |
<S,R extends org.apache.hadoop.io.Writable> | MasterCompute.registerReducer(String name, ReduceOperation<S,R> reduceOp, R globalInitialValue) |
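
The typical flow pairs a master-side registration with worker-side contributions. The sketch below is a hedged example of that handshake; the reducer name "edgeCount", the use of DefaultMasterCompute, and the built-in SumReduce.LONG are illustrative assumptions.

```java
import org.apache.giraph.master.DefaultMasterCompute;
import org.apache.giraph.reducers.impl.SumReduce;
import org.apache.hadoop.io.LongWritable;

// Hedged sketch of the master-side half of the reducer flow.
public class EdgeCountMaster extends DefaultMasterCompute {
  @Override
  public void compute() {
    if (getSuperstep() > 0) {
      // Value reduced during the previous worker computation.
      LongWritable total = getReduced("edgeCount");
      System.out.println("Total out-degree: " + total.get());
    }
    // Registered reducers are reduced in the next worker computation.
    registerReducer("edgeCount", SumReduce.LONG);
  }
}
```

On the worker side, a Computation contributes values with reduce("edgeCount", new LongWritable(vertex.getNumEdges())), and the master reads the aggregate back with getReduced in the following superstep, as shown above.
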
Modifier and Type | Class and Description |
---|---|
class | ReduceSameTypeOperation<R extends org.apache.hadoop.io.Writable>: ReduceOperation for the case where the single object being reduced has the same type as the reduced value. |
Modifier and Type | Method and Description |
---|---|
ReduceOperation<S,R> | Reducer.getReduceOp() |

Constructor and Description |
---|
Reducer(ReduceOperation<S,R> reduceOp) Constructor |
Reducer(ReduceOperation<S,R> reduceOp, R currentValue) Constructor |
Modifier and Type | Class and Description |
---|---|
class | AndReduce: ReduceOperation for calculating the AND function over boolean values. |
class | KryoWrappedReduceOperation<S,R>: Reduce operation which wraps the reduced value in a KryoWritableWrapper, so it does not itself need to be Writable. |
class | LongXorReduce: ReduceOperation that XORs (^) values together. |
class | MaxPairReducer<L extends org.apache.hadoop.io.Writable,R extends org.apache.hadoop.io.WritableComparable>: Aggregates PairWritable<L,R> values by taking the pair with the largest second value. |
class | MaxReduce<T extends org.apache.hadoop.io.WritableComparable>: Reducer for calculating the max of values. |
class | MinReduce<T extends org.apache.hadoop.io.WritableComparable>: Reducer for calculating the min of values. |
class | OrReduce: ReduceOperation for calculating the OR function over boolean values. |
class | PairReduce<S1,R1 extends org.apache.hadoop.io.Writable,S2,R2 extends org.apache.hadoop.io.Writable>: Combines two individual reducers to create a single reducer of pairs that reduces each of them individually. |
class | SumReduce<T extends org.apache.hadoop.io.Writable>: Reducer for calculating the sum of values. |
Constructor and Description |
---|
PairReduce(ReduceOperation<S1,R1> reduce1, ReduceOperation<S2,R2> reduce2) Constructor |
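
For instance, a single reducer that tracks both the sum and the maximum of long contributions can be assembled from two built-ins (a sketch; SumReduce.LONG and MaxReduce.LONG are assumed to be among the predefined instances of those classes):

```java
// Hedged sketch: reduce (sum, max) pairs of longs with one PairReduce.
PairReduce<LongWritable, LongWritable, LongWritable, LongWritable> sumAndMax =
    new PairReduce<>(SumReduce.LONG, MaxReduce.LONG);
```
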
Copyright © 2011-2020 The Apache Software Foundation. All Rights Reserved.