
Run GPU supported LLM inside container with devcontainer #143

Open

Kiran1689 wants to merge 7 commits into main

Conversation

@Kiran1689 (Contributor) commented Jan 24, 2025

/claim #4

fixes #4

Writer's Checklist

Writing Structure

  • Use short sentences and paragraphs, and include bucket brigades.
  • Include more than two descriptive H2 headings to organize content.
  • Capitalize headings according to the AP Stylebook
    (use this tool)
  • Include an introduction with at least two paragraphs before the first H2
    section.
  • Use appropriate formatting (bold, italic, underline), notes, quotes,
    TLDRs, and key points.
  • Incorporate code elements and Markdown format where appropriate.
  • Ensure at least one visual element per “screen” of the article
    (screenshots, diagrams, tables, graphs, lists, sidenotes, blockquotes).

Fact-Checking

  • Verify all facts and data points included in the article.

Asset Management

  • Save images in the /assets folder.
  • Follow naming conventions:
    YYYYMMDD_title_of_the_article_IMG_NAME_NO.png.
  • (Optional) Create a GitHub repo for the code referenced in the article and
    share it.
  • (Optional) Include a link to this Loom video in the PR comments.

Interlinking

Glossary/Definitions

  • Create new definitions in the /definitions folder.

Review and Edit

  • Ensure articles address the needs of the target audience and their search
    intent.
  • Read the article out loud to catch any awkward phrasing.
  • Run the draft through Grammarly or a similar
    grammar tool.
  • Double-check adherence to the style guide and repository guidelines.
  • Use the name of the article for the title of the PR.

Signed-off-by: Kiran1689 <[email protected]>
Signed-off-by: Kiran1689 <[email protected]>
@Kiran1689 (Contributor, Author):

@mojafa Can you please review this? If possible, please assign the issue to me

@mojafa (Collaborator) commented Jan 27, 2025

@Kiran1689 I believe you can use Daytona without spinning up a Linux server/VM in the cloud. Specify all the necessary packages and images in a Dockerfile and keep it in your .devcontainer folder, something like that.
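A minimal sketch of what that could look like, assuming a CUDA base image and the standard Dev Container spec (the name, image tag, and package list are illustrative, not taken from this PR):

```jsonc
// .devcontainer/devcontainer.json — hypothetical sketch
{
  "name": "gpu-llm",
  "build": {
    // Dockerfile lives in the same .devcontainer folder; it might start
    // FROM nvidia/cuda:12.2.0-runtime-ubuntu22.04 and pip-install the
    // packages the article needs (e.g. torch, transformers)
    "dockerfile": "Dockerfile"
  },
  // Pass the host GPU through to the container
  "runArgs": ["--gpus", "all"],
  // Ask the tooling for a GPU-capable host
  "hostRequirements": { "gpu": true }
}
```

This keeps the whole environment declarative: anyone opening the repo in a devcontainer-aware tool gets the same GPU-enabled image without manual VM setup.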

Signed-off-by: Kiran1689 <[email protected]>
Signed-off-by: Kiran1689 <[email protected]>
@Kiran1689 (Contributor, Author):

Updated, @mojafa.

@mojafa (Collaborator) commented Jan 28, 2025

@Kiran1689 thanks for incorporating the changes. I'll get a Linux VM tomorrow and test the code.

@Kiran1689 (Contributor, Author):

Sure @mojafa
Thanks!

Signed-off-by: Kiran1689 <[email protected]>
@mojafa (Collaborator) commented Jan 31, 2025

@Kiran1689 Thank you so much for your patience and responses. I've finally managed to get around to testing this PR.

Nice stuff! I got the GPU-based VM and installed Docker, Daytona, Git, the NVIDIA drivers, nano, and VS Code. Wow, that was a mouthful!

I was able to run nvidia-smi, create the workspace from the repo with daytona create, open the workspace over SSH in the terminal, enter my Hugging Face access token, and test your Python samples successfully.
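The test workflow above can be sketched roughly as follows (the repository URL is a placeholder, and the exact sample filename is an assumption, not taken from this PR):

```shell
# Verify the GPU is visible on the VM
nvidia-smi

# Create a Daytona workspace from the article's repo (URL is a placeholder)
daytona create https://github.com/<user>/<repo>

# Open the workspace over SSH in the terminal
daytona ssh

# Inside the workspace: authenticate with Hugging Face, then run a sample
huggingface-cli login
python sample.py
```

These commands assume the Daytona CLI, the NVIDIA drivers, and the huggingface_hub CLI are already installed on the VM, as described in the comment above.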

(Screenshots attached: nvidia-smi output on the GPU VM, Daytona workspace creation, and the Python samples running successfully.)

@mojafa (Collaborator) commented Jan 31, 2025

As discussed, please fix the intro and the TL;DR sections, and run the article once through Grammarly or something like that. Good stuff, man!

Signed-off-by: Kiran1689 <[email protected]>
@Kiran1689 (Contributor, Author):

Thanks for your cooperation, @mojafa 🙏

I've updated the article accordingly.

@mojafa (Collaborator) commented Jan 31, 2025

@nkkko This PR meets the requirements. Please check.
