CP Alternating Poisson Regression (cp_apr)

Note

The cp_apr function defined in cp_apr.py has been promoted to the pyttb namespace.

pyttb.cp_apr(input_tensor: tensor | sptensor, rank: int, algorithm: Literal['mu', 'pdnr', 'pqnr'] = 'mu', stoptol: float = 1e-4, stoptime: float = 1e6, maxiters: int = 1000, init: ktensor | Literal['random'] = 'random', maxinneriters: int = 10, epsDivZero: float = 1e-10, printitn: int = 1, printinneritn: int = 0, kappa: float = 0.01, kappatol: float = 1e-10, epsActive: float = 1e-8, mu0: float = 1e-5, precompinds: bool = True, inexact: bool = True, lbfgsMem: int = 3) → Tuple[ktensor, ktensor, Dict]

Compute non-negative CP with alternating Poisson regression.

Parameters:
  • input_tensor (pyttb.tensor or pyttb.sptensor) – Tensor to decompose

  • rank (int) – Rank of the decomposition

  • algorithm (str) – One of {'mu', 'pdnr', 'pqnr'}: multiplicative update, projected damped Newton row subproblem, or projected quasi-Newton row subproblem

  • stoptol (float) – Tolerance on overall KKT violation

  • stoptime (float) – Maximum number of seconds to run

  • maxiters (int) – Maximum number of iterations

  • init (str or pyttb.ktensor) – Initial guess

  • maxinneriters (int) – Maximum inner iterations per outer iteration

  • epsDivZero (float) – Safeguard against divide by zero

  • printitn (int) – Print every n outer iterations, 0 for none

  • printinneritn (int) – Print every n inner iterations

  • kappa (float) – MU ALGORITHM PARAMETER: Offset to fix complementary slackness

  • kappatol (float) – MU ALGORITHM PARAMETER: Tolerance on complementary slackness

  • epsActive (float) – PDNR & PQNR ALGORITHM PARAMETER: Bertsekas tolerance for active set

  • mu0 (float) – PDNR ALGORITHM PARAMETER: Initial damping parameter

  • precompinds (bool) – PDNR & PQNR ALGORITHM PARAMETER: Precompute sparse tensor indices

  • inexact (bool) – PDNR ALGORITHM PARAMETER: Compute inexact Newton steps

  • lbfgsMem (int) – PQNR ALGORITHM PARAMETER: Number of vector pairs to store for the L-BFGS approximation

Returns:

  • M (pyttb.ktensor) – Resulting ktensor from CP APR

  • Minit (pyttb.ktensor) – Initial guess

  • output (dict) – Dictionary of additional information about the solve