# Abstract Algebra: Commutativity and the Abelian Property in Groups and Rings

Let $(G, \cdot)$ be a group and $(R, +, \cdot)$ a ring. Consider the set

$R[G] := \left\{ (\alpha_{g})_{g\in G} \,\middle|\, \alpha_{g} \in R \text{ for all } g \in G \text{ and } \alpha_{g} = 0_{R} \text{ for all but finitely many } g \in G \right\}$

with the operations $(\alpha_{g})_{g\in G} + (\beta_{g})_{g\in G} := (\alpha_{g} + \beta_{g})_{g\in G}$

and $(\alpha_{x})_{x \in G} \cdot (\beta_{y})_{y \in G} := \left( \sum_{x,y \in G,\ xy=g} \alpha_{x}\beta_{y}\right)_{g \in G}.$

Prove that:

a) $(R[G], +, \cdot)$ is a ring.

b) If $R$ is commutative and $G$ is abelian, then $R[G]$ is a commutative ring.

c) If $R$ is a ring with a neutral element (unit), then $R[G]$ is a ring with a neutral element.

## Answer

a) We need to check the ring axioms. First we check that addition and multiplication are well defined.

Let $\alpha,\beta\in R[G]$, let $I(\alpha)=\{g\in G\mid \alpha_g\neq0\}$ and $I(\beta)$ similarly defined. Then $(\alpha+\beta)_g:=\alpha_g+\beta_g$ is zero outside of the finite set $I(\alpha)\cup I(\beta)$, hence defines an element of $R[G]$.

Further:

$$(\alpha\cdot \beta)_g = \sum_{x,y\in G , x\cdot y =g}\alpha_x \beta_y = \sum_{x\in I(\alpha), y\in I(\beta) , xy=g}\alpha_x\beta_y$$

since all other terms of the sum are $0$. But this is then a sum over the _finite_ index set $I(\alpha)\times I(\beta)$, hence a finite sum, and $\sum_{x,y\in G,\ x\cdot y= g} \alpha_x\beta_y$ is a well-defined element of $R$. It remains to show that it is non-zero only for finitely many $g$. Here we note that the set:

$$M(\alpha,\beta):=\{x\cdot y\mid x\in I(\alpha) , y\in I(\beta)\}$$

is a finite set, since $I(\alpha)$ and $I(\beta)$ are finite. So $(\alpha\cdot \beta)_g$ can be non-zero only for finitely many $g$ (for all $g\notin M(\alpha,\beta)$ every product in the sum defining $(\alpha\cdot\beta)_g$ is zero!).
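To make the definitions concrete, here is a minimal Python sketch (an illustrative assumption, not part of the problem) that encodes elements of $R[G]$ as dictionaries storing only the non-zero coefficients — exactly the finite-support condition — with $R=\mathbb{Z}$ and the cyclic group $G=\mathbb{Z}/3\mathbb{Z}$:

```python
# Elements of R[G] as dicts g -> coefficient, keeping only nonzero entries
# (this encodes "alpha_g = 0_R for all but finitely many g").
# Hypothetical concrete choice: R = Z (Python ints), G = Z/3Z, written additively.
N = 3  # order of the cyclic group

def rg_add(a, b):
    # componentwise sum (alpha_g + beta_g), dropping zero coefficients
    out = {g: a.get(g, 0) + b.get(g, 0) for g in set(a) | set(b)}
    return {g: c for g, c in out.items() if c != 0}

def rg_mul(a, b):
    # convolution product: (a*b)_g = sum over x*y = g of a_x * b_y;
    # the loops run only over the finite supports I(a) and I(b)
    out = {}
    for x, ax in a.items():
        for y, by in b.items():
            g = (x + y) % N  # group operation of Z/3Z
            out[g] = out.get(g, 0) + ax * by
    return {g: c for g, c in out.items() if c != 0}

# example: (1 + t) * (1 + t^2) in Z[Z/3Z], writing t^k for the class of k
print(sorted(rg_mul({0: 1, 1: 1}, {0: 1, 2: 1}).items()))
# -> [(0, 2), (1, 1), (2, 1)], i.e. 2 + t + t^2
```

Note that dropping zero coefficients after each operation is what keeps the supports finite, mirroring the finiteness argument above.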

Next we check that addition forms a group:

(i) Associativity: For $\alpha, \beta,\gamma\in R[G]$ we have

$\qquad [(\alpha+\beta)+ \gamma]_g= (\alpha+\beta)_g+\gamma_g = (\alpha_g+\beta_g)+\gamma_g$

$\qquad \quad \overset{(*)}=\alpha_g+ (\beta_g+\gamma_g) = \alpha_g+(\beta+\gamma)_g=[\alpha+(\beta+\gamma)]_g$

where we used associativity of addition in $R$ in $(*)$.

(ii) Neutral element: Define $e_g:=0_R$ for all $g\in G$. Its support is empty, in particular finite, so $e\in R[G]$. Then for any $\alpha\in R[G]$:

$$(\alpha+e)_g = \alpha_g+e_g =\alpha_g+0 = \alpha_g$$

and $e$ is a neutral element on the right (by commutativity of addition, shown in (iv), it is also neutral on the left).

(iii) Existence of inverses: For $\alpha\in R[G]$ define $(-\alpha)_g := -\alpha_g$, note that $(-\alpha)_g\neq0$ iff $\alpha_g\neq0$, so $(-\alpha)_g$ is non-zero only for finitely many $g$ and $(-\alpha)\in R[G]$. Then:

$$[\alpha+ (-\alpha)]_g = \alpha_g+(-\alpha)_g = \alpha_g-\alpha_g =0=e_g$$

and $-\alpha$ is a right-inverse of $\alpha$.

(iv) Commutativity of addition: For $\alpha,\beta \in R[G]$ we have:

$$(\alpha+\beta)_g = \alpha_g+\beta_g \overset{(*)}=\beta_g+\alpha_g = (\beta+\alpha)_g$$

where we used commutativity of addition in $R$ in $(*)$.

The calculation that multiplication is associative is made unwieldy by the indices. We simplify notation a bit by noting that we may rewrite any sum

$$\sum_{x,y\in G, xy=g}$$

to

$$\sum_{x\in G}$$

and replace the $y$ in the summand by $x^{-1}g$. Now for $\alpha,\beta,\gamma\in R[G]$ check:

$$[(\alpha\cdot\beta)\cdot \gamma]_g = \sum_{x\in G}(\alpha\cdot \beta)_x\gamma_{x^{-1}g} = \sum_{x\in G}\sum_{a\in G}\alpha_a\beta_{a^{-1}x}\gamma_{x^{-1}g} $$

Now we switch the two sums (which we may do since only finitely many terms are non-zero) and substitute $\tilde x = a$, $\tilde a = a^{-1}x$ to get:

$$=\sum_{\tilde x \in G}\sum_{\tilde a\in G}\alpha_{\tilde x}\beta_{\tilde a}\gamma_{\tilde a^{-1}\tilde x^{-1}g}= \sum_{\tilde x}\alpha_{\tilde x}(\beta\cdot \gamma)_{\tilde x^{-1}g}= [\alpha\cdot(\beta\cdot\gamma)]_g$$

Lastly we must check the distributivity laws. For $\alpha,\beta,\gamma\in R[G]$ we have:

$$[(\alpha+\beta)\cdot \gamma]_g = \sum_{x,y\in G, xy=g}(\alpha_x+\beta_x)\gamma_y= \sum_{x,y\in G, xy=g}\alpha_x\gamma_y+\sum_{x,y\in G, xy=g}\beta_x\gamma_y = [\alpha\cdot\gamma+\beta\cdot\gamma]_g$$

The calculation for

$$[\alpha\cdot (\beta+\gamma)]_g = [\alpha\cdot \beta+\alpha\cdot \gamma]_g$$

is identical.
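The associativity and distributivity identities just derived can be spot-checked numerically. The following sketch (an illustrative assumption: elements of $R[G]$ as Python dicts with finite support, $R=\mathbb{Z}$, $G=\mathbb{Z}/4\mathbb{Z}$) verifies both on sample elements:

```python
# Hypothetical concrete model of R[G]: R = Z, G = Z/4Z (written additively),
# elements as dicts g -> nonzero coefficient.
N = 4

def rg_add(a, b):
    out = {g: a.get(g, 0) + b.get(g, 0) for g in set(a) | set(b)}
    return {g: c for g, c in out.items() if c != 0}

def rg_mul(a, b):
    out = {}
    for x, ax in a.items():
        for y, by in b.items():
            g = (x + y) % N
            out[g] = out.get(g, 0) + ax * by
    return {g: c for g, c in out.items() if c != 0}

a = {0: 2, 1: -1}
b = {1: 3, 3: 1}
c = {0: 1, 2: 5}

# associativity: (a*b)*c == a*(b*c)
assert rg_mul(rg_mul(a, b), c) == rg_mul(a, rg_mul(b, c))
# left distributivity: (a+b)*c == a*c + b*c
assert rg_mul(rg_add(a, b), c) == rg_add(rg_mul(a, c), rg_mul(b, c))
print("associativity and distributivity hold on the samples")
```

Of course a finite check is no proof; it merely illustrates the identities the index manipulations above establish in general.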

(b) Assume $R$ is commutative and $G$ is abelian. For $\alpha,\beta \in R[G]$ we then have

$$(\alpha\cdot \beta)_g = \sum_{x,y\in G , xy= g}\alpha_x\cdot\beta_y \overset{*}= \sum_{x,y\in G, xy=g}\beta_y\alpha_x \overset{**}= \sum_{y,x\in G, yx=g}\beta_y\alpha_x = (\beta\cdot \alpha)_g$$

where in $*$ we used commutativity of $R$ and in $**$ commutativity of $G$.

(c) Assume $R$ has a unit $1_R$, and let $1_G$ be the neutral element of $G$. Define $N_g:= \begin{cases}1_R & g=1_G\\ 0 &\text{else}\end{cases}$. Then $N$ defines an element of $R[G]$ (being non-zero for only one $g$) and for any $\alpha \in R[G]$:

$$(N\cdot \alpha)_g = \sum_{x,y\in G, xy=g}N_x\cdot \alpha_y \overset*= \sum_{y\in G, 1_Gy=g} 1_R\cdot \alpha_y \overset{**}= \alpha_g$$

where in $*$ we used that $N_x$ is $0$ whenever $x\neq 1_G$, and in $**$ that $1_Gy=g$ iff $y=g$. This shows that $N$ is a left unit; the same calculation shows that $\alpha\cdot N = \alpha$, so $N$ is also a right unit.
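In the dict encoding of $R[G]$ used for illustration (an assumption: $R=\mathbb{Z}$, $G=\mathbb{Z}/3\mathbb{Z}$ with neutral element $0$), the element $N$ is simply the dict `{0: 1}`, and the unit property can be checked directly:

```python
# Hypothetical model: R = Z, G = Z/3Z additively, neutral element of G is 0.
N_ORDER = 3

def rg_mul(a, b):
    # convolution product in R[G]
    out = {}
    for x, ax in a.items():
        for y, by in b.items():
            g = (x + y) % N_ORDER
            out[g] = out.get(g, 0) + ax * by
    return {g: c for g, c in out.items() if c != 0}

unit = {0: 1}          # N_g = 1_R at g = 1_G, zero elsewhere
alpha = {0: 4, 2: -3}  # an arbitrary sample element

assert rg_mul(unit, alpha) == alpha  # left unit
assert rg_mul(alpha, unit) == alpha  # right unit
print("N is a two-sided unit on the sample")
```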

