Update energy-efficent-framework.md #320

Open · wants to merge 1 commit into main

6 changes: 6 additions & 0 deletions docs/catalog/ai/energy-efficent-framework.md
@@ -18,6 +18,12 @@ Training an AI model implies a significant carbon footprint. The underlying fram

Typically, AI/ML frameworks built on languages like C/C++ are more energy efficient than those built on other programming languages.

Following provides an interesting view on a [normalized analysis regarding languages and their energy footprint(https://sites.google.com/view/energy-efficiency-languages/results?authuser=0)].

Suggested change
Following provides an interesting view on a [normalized analysis regarding languages and their energy footprint(https://sites.google.com/view/energy-efficiency-languages/results?authuser=0)].
The following provides an interesting view on the topic: a [normalized analysis of programming languages and their energy footprint](https://sites.google.com/view/energy-efficiency-languages/results?authuser=0).
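
For context (not part of this change), a minimal sketch of the kind of gap such an analysis quantifies: the same reduction written as a pure-Python loop and as a call into NumPy's compiled C backend. Wall-clock time is used only as a rough proxy for energy, and the array size and `numpy` dependency are illustrative assumptions.

```python
import time
import numpy as np

data = list(range(10_000_000))
arr = np.arange(10_000_000, dtype=np.int64)

# Pure-Python reduction: every addition runs as interpreted bytecode.
t0 = time.perf_counter()
total_py = sum(data)
t_py = time.perf_counter() - t0

# NumPy reduction: the same loop runs inside compiled C code.
t0 = time.perf_counter()
total_np = int(arr.sum())
t_np = time.perf_counter() - t0

assert total_py == total_np
print(f"pure Python: {t_py:.3f}s, NumPy (C backend): {t_np:.3f}s")
print(f"speedup ~{t_py / max(t_np, 1e-9):.0f}x; shorter runtime usually means less energy")
```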


Take into accoutn that in the Cloud for example the processing time of non CPU intensive applications is mostly spent idling because of the strong interservice dependencies that are network related (In other words a request through the network is typically 10000+ times slower than a CPU operation). 

Suggested change
Take into accoutn that in the Cloud for example the processing time of non CPU intensive applications is mostly spent idling because of the strong interservice dependencies that are network related (In other words a request through the network is typically 10000+ times slower than a CPU operation). 
In the cloud, the processing time for non-CPU-intensive applications is often spent idling due to strong inter-service dependencies that are network-related. In other words, a request over the network is typically more than 10,000 times slower than a CPU operation.
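
As a back-of-the-envelope check of that ratio (illustrative figures, not measurements from this project): a simple CPU operation takes on the order of a nanosecond, while a round trip inside a data center takes on the order of half a millisecond.

```python
# Order-of-magnitude assumptions, not measurements from this repository:
cpu_op_ns = 1                # ~1 ns for a simple CPU operation
datacenter_rtt_ns = 500_000  # ~0.5 ms for an in-datacenter network round trip

ratio = datacenter_rtt_ns / cpu_op_ns
print(f"one network round trip ~= {ratio:,.0f} CPU operations")
# ~500,000x, comfortably above the 10,000x floor mentioned above
```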


Libraries like TensorFlow or PyTorch allow to leverage GPUs easily that would have a different consumption scheme compared to CPUs, TPU for TensorFlow typically outperform as well CPUs: All these points need to be properly analyzed before selecting a language that might bring drawbacks in terms of staffing skilled people.

Suggested change
Libraries like TensorFlow or PyTorch allow to leverage GPUs easily that would have a different consumption scheme compared to CPUs, TPU for TensorFlow typically outperform as well CPUs: All these points need to be properly analyzed before selecting a language that might bring drawbacks in terms of staffing skilled people.
Libraries like TensorFlow or PyTorch allow easy GPU utilization, which has a different consumption pattern compared to CPUs. TPUs (Tensor Processing Units) in TensorFlow typically outperform CPUs as well. All these points need proper analysis before selecting a language, as the choice may also affect the availability of skilled staff.
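
To make the GPU point concrete, a minimal PyTorch sketch (illustrative only, not part of this change); TensorFlow offers a comparable device-placement mechanism. The matrix sizes are arbitrary assumptions.

```python
import torch

# Use an accelerator when one is available; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(2048, 2048, device=device)
b = torch.randn(2048, 2048, device=device)

# Same operation either way; the consumption profile depends on where it runs.
c = a @ b
print(f"matmul executed on: {c.device}")
```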



## Solution
Evaluate and select an energy-efficient framework/module for AI/ML development, training and inference.