pastix.c

Implementations of the PaStiX external functions.

Authors

Mathieu FAVERGE (faverge@labri.fr)
Xavier LACOSTE (lacoste@labri.fr)
Pierre RAMET (ramet@labri.fr)
Summary
pastix.c - Implementations of the PaStiX external functions.
Defines and Macros
Macros
FORTRAN_CALL - Call a Fortran function.
pastix.c defines
Macros
Macros
print_onempi - Print a string using processor 0.
Functions
pastix_initParam - Sets default parameters for iparm and dparm.
redispatch_rhs - Redistribute the right-hand-side member from l2g to newl2g.
_indexValue
Variables
value
index
redispatch_rhs
buildUpDoVect - Build the UpDownVector from the user's B vector or compute it, depending on iparm.
buildUpdoVect
global2localrhs - Converts a global right-hand side to a local right-hand side.
global2localperm - Converts a global permutation (resp. reverse permutation) array to a local one.
sizeofsolver - Computes the size in memory of the SolverMatrix.
pastix_task_init - Allocate and fill in pastix_data.
pastix_print_memory_usage - Print memory usage during PaStiX.
pastix_welcome_print - Print the welcome message, options and parameters.
pastix_order_save - Save ordering structures on disk.
pastix_order_load - Load ordering structures from disk.
pastix_order_prepare_csc - Create a copy of the user's CSC and prepare it for the ordering step.
pastix_task_scotch - Execute the ordering task, with a centralised graph.
dpastix_order_prepare_cscd - Create a copy of the user's CSCd and prepare it for the ordering step.
dpastix_task_scotch - Execute the ordering task, with a distributed graph.
pastix_task_fax - Symbolic factorisation.
dpastix_task_fax - Symbolic factorisation.
pastix_task_blend - Distribution task.
sopalin_check_param - Check parameter consistency.
pastix_fillin_csc - Fill in the internal CSC based on the user CSC and fill in the coeftab structure.
pastix_task_sopalin - Factorisation, updown and refinement tasks.
pastix_task_updown - Updown task.
pastix_task_raff - Refinement task.
pastix_task_clean - Cleaning task.
pastix_unscale
pastix - Computes one to all steps of the resolution of the Ax=b linear system, using direct methods.
Example - from file simple.c.
dpastix - Computes one to all steps of the resolution of the Ax=b linear system, using direct methods.
pastix_bindThreads - Set bindtab in pastix_data; it gives, for each thread, the CPU to bind to.
pastix_checkMatrix_int - Check the matrix.
pastix_getLocalUnknownNbr - Return the number of unknowns in the new distribution computed by blend.
pastix_getLocalNodeNbr - Return the number of nodes in the new distribution computed by blend.
cmpint
pastix_getLocalUnknownLst - Fill in unknowns with the list of local nodes/columns.
pastix_getLocalNodeLst - Fill in nodelst with the list of local nodes/columns.
pastix_setSchurUnknownList - Set the list of unknowns to isolate at the end of the matrix via permutations.
pastix_getSchurLocalNodeNbr - Compute the number of nodes in the local part of the Schur.
pastix_getSchurLocalUnkownNbr - Compute the number of unknowns in the local part of the Schur.
pastix_getSchurLocalNodeList - Compute the list of nodes in the local part of the Schur.
pastix_getSchurLocalUnkownList - Compute the list of unknowns in the local part of the Schur.
pastix_getSchurLocalUnknownList
pastix_setSchurArray - Give PaStiX a user memory area in which to store the Schur complement.
pastix_getSchur - Get the Schur complement from PaStiX.
pastix_checkMatrix - Check the matrix.

Defines and Macros

Summary
Macros
FORTRAN_CALL - Call a Fortran function.
pastix.c defines

Macros

FORTRAN_CALL

Call a Fortran function.

Parameters

name - Fortran function name.
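
As an illustration only (this is an assumed definition, not taken from pastix.c), a macro of this kind usually just applies the platform's Fortran symbol-mangling convention, most commonly a trailing underscore:

/* Hypothetical sketch of a Fortran name-mangling macro (assumed, not the
 * actual PaStiX definition): many Fortran compilers append an underscore. */
#define FORTRAN_CALL(name) name ## _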

pastix.c defines

PASTIX_LOG - If defined, the start and end of this file's functions will be printed on stdout.
COMPUTE - If not defined, PaStiX will not use the user's coefficients.
FORGET_PARTITION - If defined, PaStiX won't use the Scotch partition.
DUMP_SYMBOLMATRIX - Write the symbol matrix in PostScript format.
RUSTINE - If defined, PaStiX will call symbolRustine.
STR_SIZE - The default size of a string.
TAG_RHS - MPI tag used to communicate the right-hand-side member.
SCOTCH_STRAT_DIRECT - Default Scotch strategy for the direct solver.
SCOTCH_STRAT_INCOMP - Default Scotch strategy for the incomplete solver.
SCOTCH_STRAT_PERSO - Parametrisable Scotch strategy for the direct solver, can be set using IPARMs.
PTSCOTCH_STRAT_DIRECT - Default PT-Scotch strategy for the direct solver.
PTSCOTCH_STRAT_INCOMP - Default PT-Scotch strategy for the incomplete solver.
PTSCOTCH_STRAT_PERSO - Parametrisable PT-Scotch strategy for the direct solver, can be set using IPARMs.

Macros

Summary
Macros
print_onempi - Print a string using processor 0.

Macros

print_onempi

Print a string using processor 0.  Uses printf syntax.

Parameters

fmt - Format string (see the printf manual).
... - Arguments depending on the format string.
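
A minimal usage sketch (inside pastix.c, where the macro is available); the variable names n and nnz are illustrative only:

/* Only MPI rank 0 actually performs the print. */
print_onempi("Matrix size %ld with %ld nonzeros\n", (long)n, (long)nnz);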

Functions

pastix_initParam

Sets default parameters for iparm and dparm.

Parameters

iparm - Array of IPARM_SIZE integer parameters.
dparm - Array of DPARM_SIZE double parameters.
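
A minimal sketch of how the defaults are usually obtained from user code, mirroring the simple.c example further down: a first pastix() call with IPARM_MODIFY_PARAMETER set to API_NO fills iparm/dparm with the values set by pastix_initParam, after which individual entries can be overridden (pastix_data, ncol, colptr, rows, values, perm, invp and rhs are assumed to be already declared):

pastix_int_t iparm[IPARM_SIZE];
double       dparm[DPARM_SIZE];

iparm[IPARM_MODIFY_PARAMETER] = API_NO;    /* request the default values     */
pastix(&pastix_data, MPI_COMM_WORLD,
       ncol, colptr, rows, values,
       perm, invp, rhs, 1, iparm, dparm);  /* only fills iparm/dparm here    */

iparm[IPARM_THREAD_NBR] = 4;               /* then customise what is needed  */
iparm[IPARM_VERBOSE]    = API_VERBOSE_YES;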

redispatch_rhs

Redistribute the right-hand-side member from the l2g distribution to the newl2g distribution.

Parameters

n - Size of the first right-hand-side member.
rhs - Right-hand-side member.
l2g - Local to global column numbers.
newn - New right-hand-side member size.
newrhs - New right-hand-side member.
newl2g - New local to global column numbers.
commSize - MPI communicator size.
commRank - MPI rank.
comm - MPI communicator.

_indexValue

Variables

value

FLOAT value

index

INT index

redispatch_rhs

int redispatch_rhs(INT n,
FLOAT *rhs,
INT *l2g,
INT newn,
FLOAT *newrhs,
INT *newl2g,
int commSize,
int commRank,
MPI_Comm comm)

buildUpDoVect

Build the UpDownVector from the user's B vector, or compute it, depending on iparm.

Parameters

pastix_data - PaStiX global data structure.
loc2glob - Global column number of local columns.
b - User right-hand-side member.
pastix_comm - MPI communicator.

buildUpdoVect

int buildUpdoVect(pastix_data_t *pastix_data,
INT *loc2glob,
FLOAT *b,
MPI_Comm pastix_comm)

global2localrhs

void global2localrhs(INT lN,
FLOAT *lrhs,
FLOAT *grhs,
INT *loc2glob)

Converts a global right-hand side to a local right-hand side.

Parameters

lN - Local number of columns.
lrhs - Local right-hand side.
grhs - Global right-hand side.
loc2glob - Global index of local columns.

global2localperm

void global2localperm(INT lN,
INT *lperm,
INT *gperm,
INT *loc2glob)

Converts a global permutation (resp. reverse permutation) array to a local permutation (resp. reverse permutation) array.

Parameters

lN - Local number of columns.
lperm - Local permutation array.
gperm - Global permutation array.
loc2glob - Global index of local columns.

sizeofsolver

INT sizeofsolver(SolverMatrix *solvptr,
INT *iparm)

Computes the size in memory of the SolverMatrix.

Parameters

solvptr - Address of the SolverMatrix.
iparm - Integer parameters.

Returns

SolverMatrix size.

pastix_task_init

void pastix_task_init(pastix_data_t **pastix_data,
MPI_Comm pastix_comm,
INT *iparm,
double *dparm)

Allocate and fill in pastix_data.

Parameters

pastix_data - Structure to build.
pastix_comm - PaStiX MPI communicator.
iparm - Integer parameters, used to fill in pastix_data.
dparm - Floating-point parameters, used to fill in pastix_data.

pastix_print_memory_usage

void pastix_print_memory_usage(INT *iparm,
MPI_Comm pastix_comm)

Print memory usage during PaStiX.

Parameters

iparm - Integer parameters.
pastix_comm - PaStiX MPI communicator.

pastix_welcome_print

void pastix_welcome_print(pastix_data_t *pastix_data,
INT *colptr,
INT ln)

Print the welcome message, options and parameters.

Parameters

pastix_data - PaStiX data structure.
colptr - Starting index of each column in the CSC.
ln - Number of columns.

pastix_order_save

int pastix_order_save(Order *ordemesh,
SCOTCH_Graph *grafmesh,
int procnum,
INT ncol,
INT *colptr,
INT *rows,
INT strategy)

Save ordering structures on disk.

Parameters

ordemesh - Scotch ordering structure to save.
grafmesh - Scotch Graph structure to save.
procnum - MPI process number.
ncol - Number of columns in the CSC.
colptr - Starting index of each column in rows.
rows - Row of each element.
values - Value of each element.
strategy - IO strategy.

pastix_order_load

int pastix_order_load(Order *ordemesh,
SCOTCH_Graph *grafmesh,
int procnum,
INT *ncol,
INT **colptr,
INT **rows,
INT strategy,
MPI_Comm comm)

Load ordering structures from disk.

Parameters

ordemesh - Scotch ordering structure to load.
grafmesh - Scotch Graph structure to load.
procnum - MPI process number.
ncol - Number of columns in the CSC.
colptr - Starting index of each column in rows.
rows - Row of each element.
values - Value of each element.
strategy - IO strategy.
comm - MPI communicator.

pastix_order_prepare_csc

int pastix_order_prepare_csc(pastix_data_t *pastix_data,
INT n,
INT *colptr,
INT *rows)

Create a copy of user’s CSC and prepare it for ordering step.

Symmetrizes the graph and removes diagonal coefficients.

Parameters

pastix_data - PaStiX internal data structure.
n - Number of columns in the CSC.
colptr - Start of each column in the rows array.
rows - Row number of each nonzero.

pastix_task_scotch

int pastix_task_scotch(pastix_data_t **pastix_data,
MPI_Comm pastix_comm,
INT n,
INT *colptr,
INT *row,
INT *perm,
INT *invp)

Execute ordering task, with a centralised graph.

Free col2 and row2 entries of pastix_data if pastix_task_scotch has already been called.

Set col2, row2 and n2 to a copy of user’s CSC.

Symmetrize this CSC.

Remove diagonal elements from it.

Cleans the last ordering if it exists.  Depending on IPARM_ORDERING:

  • Calls Scotch ordering,
  • Calls Metis ordering,
  • Uses the user's ordering,
  • Loads an ordering stored on disk in Scotch format.

Can save the computed ordering on disk.

Returns the computed ordering in the user's arrays.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
n - Size of the matrix / number of vertices.
colptr - Starting index of each column in row.
row - Row of each element.
perm - Permutation array.
invp - Reverse permutation array.

dpastix_order_prepare_cscd

int dpastix_order_prepare_cscd(pastix_data_t *pastix_data,
INT n,
INT *colptr,
INT *rows,
INT *loc2glob,
MPI_Comm pastix_comm)

Create a copy of user’s CSCd and prepare it for ordering step.

Symmetrizes the graph and removes diagonal elements.

Redistributes the CSCd to be able to give it to Scotch if needed.  Indeed, PT-Scotch only allows consecutive columns.  PTS_permtab is the permutation array from the user's distribution to the PT-Scotch one; PTS_peritab is the reverse permutation.

Parameters

pastix_data - PaStiX internal data structure.
n - Number of columns in the CSCd.
colptr - Start of each column in the rows array.
rows - Row number of each nonzero.
loc2glob - Local to global column number array.
pastix_comm - MPI communicator.

dpastix_task_scotch

int dpastix_task_scotch(pastix_data_t **pastix_data,
MPI_Comm pastix_comm,
INT n,
INT *colptr,
INT *row,
INT *perm,
INT *invp,
INT *loc2glob)

Execute ordering task, with a distributed graph.

In LOAD mode, only load the graph from disk.

Otherwise, cleans col2, row2, loc2glob2 and grafmesh if the ordering task has been called before.

Builds the graph and calls the PT-Scotch ordering.

Gathers the graph to be able to perform the centralised version of the symbolic factorisation.

Saves the graph if requested.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
n - Size of the matrix / number of vertices.
colptr - Starting index of each column in row.
row - Row of each element.
perm - Permutation array.
invp - Reverse permutation array.
loc2glob - Global index of local columns.

pastix_task_fax

void pastix_task_fax(pastix_data_t **pastix_data,
MPI_Comm pastix_comm,
INT *perm,
INT *invp,
int flagWinvp)

Symbolic factorisation.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
n - Size of the matrix.
perm - Permutation array.
invp - Reverse permutation array.
flagWinvp - Flag indicating whether to print a warning about perm and invp modification.

dpastix_task_fax

void dpastix_task_fax(pastix_data_t *pastix_data,
MPI_Comm pastix_comm,
INT n,
INT *perm,
INT *loc2glob,
int flagWinvp)

Symbolic factorisation.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
n - Size of the local matrix.
perm - Local permutation array.
loc2glob - Global number of local columns (NULL if not distributed).
flagWinvp - Flag indicating whether to print a warning about perm and invp modification.

pastix_task_blend

void pastix_task_blend(pastix_data_t **pastix_data,
MPI_Comm pastix_comm)

Distribution task.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
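
The ordering, symbolic factorisation and distribution tasks above are normally driven from user code through pastix() via IPARM_START_TASK and IPARM_END_TASK.  A minimal sketch, assuming the API_TASK_* constants of the PaStiX API (API_TASK_ORDERING and API_TASK_CLEAN appear in the simple.c example further down; API_TASK_ANALYSE and API_TASK_NUMFACT are assumptions here):

/* 1. Ordering, symbolic factorisation and distribution (blend).            */
iparm[IPARM_START_TASK] = API_TASK_ORDERING;
iparm[IPARM_END_TASK]   = API_TASK_ANALYSE;
pastix(&pastix_data, MPI_COMM_WORLD,
       ncol, colptr, rows, values,
       perm, invp, rhs, 1, iparm, dparm);

/* 2. Numerical factorisation, solve, refinement and clean-up.              */
iparm[IPARM_START_TASK] = API_TASK_NUMFACT;
iparm[IPARM_END_TASK]   = API_TASK_CLEAN;
pastix(&pastix_data, MPI_COMM_WORLD,
       ncol, colptr, rows, values,
       perm, invp, rhs, 1, iparm, dparm);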

sopalin_check_param

int sopalin_check_param(pastix_data_t *pastix_data)

Check parameters consistency.

Parameters

pastix_dataPaStiX data structure.

Return

NO_ERR - if no error occurred.
BADPARAMETER_ERR - if parameters are not correct on at least one process.

pastix_fillin_csc

int pastix_fillin_csc(pastix_data_t *pastix_data,
MPI_Comm pastix_comm,
INT n,
INT *colptr,
INT *row,
FLOAT *avals,
FLOAT *b,
INT *loc2glob)

Fill in the internal CSC based on the user CSC and fill in the coeftab structure.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
n - Size of the matrix.
colptr - Starting index of each column in row and avals.
row - Row of each element of the matrix.
avals - Value of each element of the matrix.
b - Right-hand side.
loc2glob - Global number of local columns, NULL if not distributed.

pastix_task_sopalin

int pastix_task_sopalin(pastix_data_t *pastix_data,
MPI_Comm pastix_comm,
INT n,
INT *colptr,
INT *row,
FLOAT *avals,
FLOAT *b,
INT rhsnbr,
INT *loc2glob)

Factorisation, updown and refinement tasks.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
n - Size of the matrix.
colptr - Starting index of each column in row and avals.
row - Row of each element of the matrix.
avals - Value of each element of the matrix.
b - Right-hand side.
rhsnbr - Number of right-hand-side members.
loc2glob - Global number of local columns, NULL if not distributed.

pastix_task_updown

void pastix_task_updown(pastix_data_t *pastix_data,
MPI_Comm pastix_comm,
INT n,
FLOAT *b,
INT rhsnbr,
INT *loc2glob)

Updown task.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
n - Matrix size.
b - Right-hand side.
rhsnbr - Number of right-hand-side members.
loc2glob - Local to global column number.

pastix_task_raff

void pastix_task_raff(pastix_data_t *pastix_data,
MPI_Comm pastix_comm,
INT n,
FLOAT *b,
INT rhsnbr,
INT *loc2glob)

Refinement task.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.
n - Matrix size.
b - Right-hand side.
rhsnbr - Number of right-hand-side members.
loc2glob - Local to global column number.

pastix_task_clean

void pastix_task_clean(pastix_data_t **pastix_data,
MPI_Comm pastix_comm)

Cleaning task.

Parameters

pastix_data - PaStiX data structure.
pastix_comm - PaStiX MPI communicator.

pastix_unscale

void pastix_unscale(pastix_data_t *pastix_data,
INT sym)

pastix

void pastix(pastix_data_t **pastix_data,
MPI_Comm pastix_comm,
INT n,
INT *colptr,
INT *row,
FLOAT *avals,
INT *perm,
INT *invp,
FLOAT *b,
INT rhs,
INT *iparm,
double *dparm)

Computes one to all steps of the resolution of the Ax=b linear system, using direct methods.

The matrix is given in CSC format.

Parameters

pastix_data - Data used for a step-by-step execution.
pastix_comm - MPI communicator used to compute the resolution.
n - Size of the system.
colptr - Array containing the start of each column in the row and avals arrays.
row - Array containing the row number of each element, sorted by column.
avals - Array containing the value of each element, sorted by column.
perm - Permutation array for the renumbering of the unknowns.
invp - Reverse permutation array for the renumbering of the unknowns.
b - Right-hand-side vector(s).
rhs - Number of right-hand-side vector(s).
iparm - Integer parameters given to pastix.
dparm - Double parameters given to pastix.

Example

From file simple.c: read the matrix, check that it is correct (correcting it if needed), then run pastix in one call.

/*******************************************/
/*    Check Matrix format                  */
/*******************************************/
/*
 * Matrix needs :
 *    - to be in fortran numbering
 *    - to have only the lower triangular part in symmetric case
 *    - to have a graph with a symmetric structure in unsymmetric case
 */
pastix_checkMatrix(MPI_COMM_WORLD, verbosemode,
           (MTX_ISSYM(type) ? API_SYM_YES : API_SYM_NO),  API_YES,
           ncol, &colptr, &rows, &values, NULL);

/*******************************************/
/* Initialize parameters to default values */
/*******************************************/
iparm[IPARM_MODIFY_PARAMETER] = API_NO;
pastix(&pastix_data, MPI_COMM_WORLD,
       ncol, colptr, rows, values,
       perm, invp, rhs, 1, iparm, dparm);

/*******************************************/
/*       Customize some parameters         */
/*******************************************/
iparm[IPARM_THREAD_NBR] = nbthread;
if (MTX_ISSYM(type)) {
  iparm[IPARM_SYM]           = API_SYM_YES;
  iparm[IPARM_FACTORIZATION] = API_FACT_LDLT;
}
else {
  iparm[IPARM_SYM]           = API_SYM_NO;
  iparm[IPARM_FACTORIZATION] = API_FACT_LU;
}
iparm[IPARM_START_TASK]          = API_TASK_ORDERING;
iparm[IPARM_END_TASK]            = API_TASK_CLEAN;

/*******************************************/
/*           Save the rhs                  */
/*    (it will be replaced by solution)    */
/*******************************************/
rhssaved = malloc(ncol*sizeof(pastix_float_t));
memcpy(rhssaved, rhs, ncol*sizeof(pastix_float_t));

/*******************************************/
/*           Call pastix                   */
/*******************************************/
perm = malloc(ncol*sizeof(pastix_int_t));
invp = malloc(ncol*sizeof(pastix_int_t));

pastix(&pastix_data, MPI_COMM_WORLD,
       ncol, colptr, rows, values,
       perm, invp, rhs, 1, iparm, dparm);

dpastix

void dpastix(pastix_data_t **pastix_data,
MPI_Comm pastix_comm,
INT n,
INT *colptr,
INT *row,
FLOAT *avals,
INT *loc2glob,
INT *perm,
INT *invp,
FLOAT *b,
INT rhs,
INT *iparm,
double *dparm)

Computes one to all steps of the resolution of the Ax=b linear system, using direct methods.  Here the matrix is given in distributed form.

The matrix is given in CSCD format.

Parameters

pastix_data - Data used for a step-by-step execution.
pastix_comm - MPI communicator used to compute the resolution.
n - Size of the system.
colptr - Array containing the start of each column in the row and avals arrays.
row - Array containing the row number of each element, sorted by column.
avals - Array containing the value of each element, sorted by column.
loc2glob - Global column number of the local columns.
perm - Permutation array for the renumbering of the unknowns.
invp - Reverse permutation array for the renumbering of the unknowns.
b - Right-hand-side vector(s).
rhs - Number of right-hand-side vector(s).
iparm - Integer parameters given to pastix.
dparm - Double parameters given to pastix.
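
A minimal sketch of a distributed call, analogous to the centralised simple.c example above; the local CSCd arrays (lncol, lcolptr, lrows, lvalues, lrhs), the loc2glob array and the local permutation arrays are assumed to be built by the caller:

iparm[IPARM_START_TASK] = API_TASK_ORDERING;
iparm[IPARM_END_TASK]   = API_TASK_CLEAN;

/* Same argument order as pastix(), with loc2glob inserted after avals. */
dpastix(&pastix_data, MPI_COMM_WORLD,
        lncol, lcolptr, lrows, lvalues, loc2glob,
        lperm, linvp, lrhs, 1, iparm, dparm);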

pastix_bindThreads

void pastix_bindThreads (pastix_data_t *pastix_data,
INT thrdnbr,
INT *bindtab)

Set bindtab in pastix_data; for each thread it gives the CPU to bind to.  bindtab follows this organisation:

bindtab[threadnum] = CPU on which to bind thread threadnum.

Parameters

pastix_data - Data structure for step-by-step execution.
thrdnbr - Number of threads / size of the array.
bindtab - Array mapping each thread to a core of the machine.
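
A minimal sketch that pins thread t to core t, following the bindtab[threadnum] = CPU rule above (nbthread is assumed to match iparm[IPARM_THREAD_NBR]):

pastix_int_t *bindtab = malloc(nbthread * sizeof(pastix_int_t));
pastix_int_t  t;

for (t = 0; t < nbthread; t++)
  bindtab[t] = t;                       /* bind thread t to core t */

pastix_bindThreads(pastix_data, nbthread, bindtab);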

pastix_checkMatrix_int

INT pastix_checkMatrix_int(MPI_Comm pastix_comm,
INT verb,
INT flagsym,
INT flagcor,
INT n,
INT **colptr,
INT **row,
FLOAT **avals,
INT **loc2glob,
INT dof,
INT flagalloc)

Check the matrix:

  • Renumbers in Fortran numbering (base 1) if needed (base 0).
  • Checks that the matrix contains no duplicates; with flagcor == API_YES, corrects it.
  • Can scale the matrix if compiled with -DMC64 -DSCALING (untested).
  • Checks the symmetry of the graph in non-symmetric mode.  With non-distributed matrices and flagcor == API_YES, corrects the matrix.
  • Sorts the CSC.

Parameters

pastix_comm - PaStiX MPI communicator.
verb - Level of prints (API_VERBOSE_[NOT|NO|YES]).
flagsym - Indicates if the given matrix is symmetric (API_SYM_YES or API_SYM_NO).
flagcor - Indicates if the function is allowed to reallocate the matrix.
n - Size of the matrix.
colptr - First element of each column in row and avals.
row - Row of each element of the matrix.
avals - Value of each element of the matrix.
loc2glob - Global column number of local columns (NULL if not distributed).
dof - Number of degrees of freedom.
flagalloc - Indicates if the CSC allocation uses the internal malloc.

pastix_getLocalUnknownNbr

INT pastix_getLocalUnknownNbr(pastix_data_t **pastix_data)

Return the number of unknowns in the new distribution computed by blend.  Blend must have been run with pastix_data beforehand.

Parameters

pastix_data - Data used for a step-by-step execution.

Returns

Number of local unknowns in the new distribution.

pastix_getLocalNodeNbr

INT pastix_getLocalNodeNbr(pastix_data_t **pastix_data)

Return the number of nodes in the new distribution computed by blend.  Blend must have been run with pastix_data beforehand.

Parameters

pastix_data - Data used for a step-by-step execution.

Returns

Number of local nodes/columns in the new distribution.

cmpint

int cmpint(const void *p1,
const void *p2)

pastix_getLocalUnknownLst

INT pastix_getLocalUnknownLst(pastix_data_t **pastix_data,
INT *nodelst)

Fill in unknowns with the list of local nodes/columns.  nodelst must be allocated with nodenbr*sizeof(pastix_int_t), where nodenbr has been computed by pastix_getLocalUnknownNbr.

Parameters

pastix_data - Data used for a step-by-step execution.
nodelst - An array where to write the list of local nodes/columns.

pastix_getLocalNodeLst

INT pastix_getLocalNodeLst(pastix_data_t **pastix_data,
INT *nodelst)

Fill in nodelst with the list of local nodes/columns.  nodelst must be allocated with nodenbr*sizeof(pastix_int_t), where nodenbr has been computed by pastix_getLocalNodeNbr.

Parameters

pastix_data - Data used for a step-by-step execution.
nodelst - An array where to write the list of local nodes/columns.
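
A minimal sketch chaining pastix_getLocalNodeNbr and pastix_getLocalNodeLst, with nodelst sized as stated above:

pastix_int_t  nodenbr = pastix_getLocalNodeNbr(&pastix_data);
pastix_int_t *nodelst = malloc(nodenbr * sizeof(pastix_int_t));

pastix_getLocalNodeLst(&pastix_data, nodelst);
/* nodelst now holds the global indices of the local nodes/columns. */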

pastix_setSchurUnknownList

INT pastix_setSchurUnknownList(pastix_data_t *pastix_data,
INT n,
INT *list)

Set the list of unknowns to isolate at the end of the matrix via permutations.

Parameters

pastix_data - Data used for a step-by-step execution.
n - Number of unknowns.
list - List of unknowns.

pastix_getSchurLocalNodeNbr

INT pastix_getSchurLocalNodeNbr(pastix_data_t *pastix_data,
INT *nodeNbr)

Compute the number of nodes in the local part of the Schur.

Parameters

pastix_data - Common data structure for PaStiX calls.
nodeNbr - (out) Number of nodes in the Schur (local).

Returns

NO_ERR - For the moment.

TODO: Error management.

pastix_getSchurLocalUnkownNbr

INT pastix_getSchurLocalUnkownNbr(pastix_data_t *pastix_data,
INT *unknownNbr)

Compute the number of unknowns in the local part of the Schur.

Parameters

pastix_data - Common data structure for PaStiX calls.
unknownNbr - (out) Number of unknowns in the Schur (local).

Returns

NO_ERR - For the moment.

TODO: Error management.

pastix_getSchurLocalNodeList

INT pastix_getSchurLocalNodeList(pastix_data_t *pastix_data,
INT *nodes)

Compute the list of nodes in the local part of the Schur.

Parameters

pastix_data - Common data structure for PaStiX calls.
nodes - (out) Nodes in the Schur (local).

Returns

NO_ERR - For the moment.

TODO: Error management.

pastix_getSchurLocalUnkownList

Compute the list of unknowns in the local part of the Schur.

Parameters

pastix_data - Common data structure for PaStiX calls.
unknowns - (out) Unknowns in the Schur (local).

Returns

NO_ERR - For the moment.

TODO: Error management.

pastix_getSchurLocalUnknownList

INT pastix_getSchurLocalUnknownList(pastix_data_t *pastix_data,
INT *unknowns)

pastix_setSchurArray

Give PaStiX a user memory area in which to store the Schur complement.

Parameters

pastix_data - Common data structure for PaStiX calls.
array - Memory area to store the Schur complement.

Returns

NO_ERR - For the moment.

TODO: Error management.

pastix_setSchurArray

INT pastix_setSchurArray(pastix_data_t *pastix_data,
FLOAT *array)

pastix_getSchur

INT pastix_getSchur(pastix_data_t *pastix_data,
FLOAT *schur)

Get the Schur complement from PaStiX.

The Schur complement is a dense block in a column scheme.

Parameters

pastix_data - Data used for a step-by-step execution.
schur - Array to fill in with the Schur complement.
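
A hedged sketch of a Schur workflow: declare the unknowns to isolate, run the factorisation through pastix() (with the Schur option enabled in iparm), then retrieve the dense block.  The square nschur*nschur storage size and the variable names are assumptions, not taken from pastix.c:

pastix_int_t    nschur = 10;                      /* unknowns to isolate     */
pastix_int_t   *list   = malloc(nschur * sizeof(pastix_int_t));
/* ... fill list[] with the unknowns to push to the end of the matrix ...    */
pastix_setSchurUnknownList(pastix_data, nschur, list);

/* ... run the ordering and factorisation steps through pastix() ...         */

pastix_float_t *schur = malloc(nschur * nschur * sizeof(pastix_float_t));
pastix_getSchur(pastix_data, schur);              /* dense block, by columns */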

pastix_checkMatrix

INT pastix_checkMatrix(MPI_Comm pastix_comm,
INT verb,
INT flagsym,
INT flagcor,
INT n,
INT **colptr,
INT **row,
FLOAT **avals,
INT **loc2glob,
INT dof)

Check the matrix:

  • Renumbers in Fortran numbering (base 1) if needed (base 0).
  • Checks that the matrix contains no duplicates; with flagcor == API_YES, corrects it.
  • Can scale the matrix if compiled with -DMC64 -DSCALING (untested).
  • Checks the symmetry of the graph in non-symmetric mode.  With non-distributed matrices and flagcor == API_YES, corrects the matrix.
  • Sorts the CSC.

Parameters

pastix_comm - PaStiX MPI communicator.
verb - Level of prints (API_VERBOSE_[NOT|NO|YES]).
flagsym - Indicates if the given matrix is symmetric (API_SYM_YES or API_SYM_NO).
flagcor - Indicates if the function is allowed to reallocate the matrix.
n - Size of the matrix.
colptr - First element of each column in row and avals.
row - Row of each element of the matrix.
avals - Value of each element of the matrix.
loc2glob - Global column number of local columns (NULL if not distributed).
dof - Number of degrees of freedom.