- 👋 Hi, I’m @subhrm
- Software Engineer
- I may be slow to respond.
Pinned
- Pytorch Sample code for Linear regression using Least Squares.

  ```python
  import torch
  import numpy as np

  # Generate Synthetic data
  N = 500  # Num of samples
  ```
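  The gist preview above is truncated after the setup. A minimal end-to-end sketch of the same idea follows; the synthetic slope/intercept values and variable names beyond `N` are illustrative assumptions, not taken from the gist:

  ```python
  import torch

  torch.manual_seed(0)

  # Generate synthetic data: y = w*x + b plus a little Gaussian noise
  N = 500  # Num of samples
  true_w, true_b = 2.5, -1.0  # assumed ground-truth coefficients
  x = torch.rand(N, 1)
  y = true_w * x + true_b + 0.05 * torch.randn(N, 1)

  # Least squares: solve A @ theta ≈ y, where A = [x | 1] and theta = [w, b]^T
  A = torch.cat([x, torch.ones(N, 1)], dim=1)
  theta = torch.linalg.lstsq(A, y).solution
  w_hat, b_hat = theta[0].item(), theta[1].item()
  print(f"w ≈ {w_hat:.2f}, b ≈ {b_hat:.2f}")
  ```

  With 500 samples and low noise, the recovered coefficients land very close to the true values.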
- LLM vs Information Theory.MD

  Large Language Models (LLMs) like GPT and others don’t exactly *challenge* information theory in the sense of contradicting it, but they do **raise interesting questions and stretch traditional interpretations**, especially in areas like entropy, compression, meaning, and communication. Here’s a breakdown of how LLMs interact with or challenge the classical concepts of information theory:

  ---

  ### 1. **Meaning vs. Information (Shannon vs. Semantics)**
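  The note's link between LLMs, entropy, and compression can be made concrete with Shannon's formula H = -Σ p·log₂(p). A small sketch (the function name is mine, not from the gist) that measures the empirical bits per token of a sequence:

  ```python
  import math
  from collections import Counter

  def shannon_entropy_bits(tokens):
      """Empirical Shannon entropy of a token sequence, in bits per token."""
      counts = Counter(tokens)
      total = len(tokens)
      return -sum((c / total) * math.log2(c / total) for c in counts.values())

  # A perfectly predictable sequence carries 0 bits per token;
  # two equally likely symbols carry exactly 1 bit per token.
  print(shannon_entropy_bits(list("aaaa")))  # 0.0
  print(shannon_entropy_bits(list("abab")))  # 1.0
  ```

  A language model's cross-entropy loss is the same quantity measured against the model's predicted distribution, which is why lower loss corresponds directly to better compression of text.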
- word-to-word-autoencoder.ipynb

  ```json
  {
    "nbformat": 4,
    "nbformat_minor": 0,
    "metadata": {
      "colab": {
  ```


