Nonconvex minimax problems abound in modern applications. We focus on nonsmooth nonconvex minimax problems, thus departing from the weakly convex/concave and smooth models commonly assumed in the recent literature. We present proximal gradient schemes (parallel and alternating) and show that both methods can be analyzed through a single scheme within a unified framework, which relies on expanding a general convergence mechanism for nonconvex nonsmooth optimization problems. In contrast to the current literature, which focuses on the complexity of obtaining only nearly approximate stationary solutions, here we derive pointwise global convergence results and, as a by-product, prove refined complexity results. Furthermore, our approach allows us to expand the scope of minimax problems that can be addressed through the use of non-Euclidean proximal steps, and to extend the convergence and complexity results to this broader setting. This is joint work with Eyal Cohen.