Multilingual knowledge graphs (KGs) provide high-quality textual and relational information for various NLP applications, but they are often incomplete, especially in languages other than English. Previous research has shown that combining KG information across languages helps both Knowledge Graph Completion (KGC), the task of predicting missing relations between entities, and Knowledge Graph Enhancement (KGE), the task of predicting missing textual information for entities. Although previous efforts have treated KGC and KGE as independent tasks, we hypothesize that they are interdependent and mutually beneficial. To this end, we present KG-TRICK, a novel sequence-to-sequence framework that unifies the completion of textual and relational information for multilingual KGs. KG-TRICK demonstrates that: i) KGC and KGE can be unified into a single framework, and ii) combining textual information from multiple languages improves the integrity of a KG. As part of our contributions, we also introduce WikiKGE10++, the largest hand-curated benchmark for the completion of textual information in KGs, which features over 25,000 entities across 10 diverse languages.