Personal
2025-03-24
On March 21, I came across a viewpoint in 《科技爱好者周刊》 (Weekly for Tech Enthusiasts), Issue 342, "AI Cheating in Interviews: Sending a Digital Avatar to the Interview", that closely matches my own thinking.
The most important qualities:
- Wisdom
- Persistence
- Endurance (of setbacks, loneliness, and an uncertain future)
People often underestimate endurance, understanding it merely as working harder and longer than others.
In fact, endurance is also the ability to hold on to your values and goals and not give up, even when doing so looks very hard.
The ability to keep working toward a goal when there is no visible progress: that is endurance.
Staying focused in a world full of distractions, overcoming difficulties, and pressing on all take endurance.
Endurance is one of the most useful qualities a person can cultivate. It applies far more broadly than strength, intelligence, speed, or charm, and everyday life calls on it constantly.
A smarter person will sometimes outperform you and crack a hard problem faster. But with endurance, you can solve more problems.
Parenting
2025-03-24
Since her English class was canceled last year, my daughter's English has basically been left to drift.
I do coach her at home, but it has been hit-and-miss: no plan, no system.
Summing up my thoughts here:
- Go back over and review the material from her previous English class
- An IPA (phonetic symbols) book
- Picture books
- Strengthen all four skills: listening, speaking, reading, and writing (she can read texts aloud now, but her listening and writing are weak)
- Continue IPA-based sounding-out practice
- Move on to the third-grade primary school English curriculum
- Once all of that is done, consider drawing up a systematic study plan
TODO:
- Every evening after work, prepare a lesson text on one topic
- Each day she studies the texts prepared the previous week, with these requirements:
  - Recitation
  - Vocabulary
  - Dictation
  - Spelling and writing out the phonetic transcriptions
AI
2025-03-19
The Model Context Protocol (MCP) is an open standard released by Anthropic at the end of 2024. It defines a standardized interface for connecting large language models (LLMs) to external data sources, tools, and systems. Its core goal is to eliminate the complexity and maintenance burden of traditional AI integrations: by defining a common set of rules, it enables plug-and-play interaction between LLMs and resources such as databases, APIs, and local files.
My understanding: it is a way to extend an AI model with capabilities such as fetching data, running programs, sending email, or placing orders, connecting the AI "brain" to the real world.
Anthropic is the company behind the Claude coding models. It designed this protocol to extend AI models with operations such as listing directories, editing files, searching and replacing text, committing to Git, and running code formatters.
But the protocol can be extended to almost anything. For example, we could expose our company's email, SMS, and app-push services through it, so that an MCP-capable IDE could send emails, SMS messages, and push notifications directly during development.
The simplest scenario: after running unit tests, push the results to the relevant people.
At the moment only a few IDEs support it, but we are not limited to code development; in principle you could manage day-to-day work directly from the IDE as well.
Take email again as an example: you could build an MCP server that plugs in your own customer data, then set up automated scenario-based marketing tasks from within an AI conversation.
PS: an open standard like MCP is clearly where AI applications are heading.
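To make the email idea above concrete, here is a minimal sketch of an MCP server that exposes a single send_email tool. It assumes the official MCP Python SDK (the mcp package and its FastMCP helper); the server name and the tool body are hypothetical stand-ins rather than our real mail service.

```python
# Minimal sketch of an MCP server exposing a hypothetical send_email tool.
# Assumes the official Python SDK: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("notify")  # server name shown to MCP clients

@mcp.tool()
def send_email(to: str, subject: str, body: str) -> str:
    """Send an email and return a delivery status."""
    # A real server would call the company's mail API here;
    # this stub only prints, so the sketch stays self-contained.
    print(f"[stub] email to {to}: {subject}")
    return "queued"

if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport that desktop clients use
```

An MCP-capable client (Claude Desktop, Cline, and so on) pointed at this script could then call send_email as one step of a task, for example to mail out unit-test results.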
flowchart TD
    subgraph APP
        AppStart[App starts]
        AppGetTask[Receive task]
        AppParse[Parse AI output]
        AppRun[App executes task]
    end
    subgraph MCP[MCP Server]
        McpResource[Resources and interfaces]
        McpRun[MCP executes task]
    end
    User -->|Submit task| AppGetTask
    AppGetTask -->|Call the LLM<br>system prompt + task description| Model[LLM]
    AppStart -->|Fetch information| McpResource
    Model --> AppParse --> AppRun
    AppRun <--> McpRun
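The diagram can be read as the loop sketched below. The MCP client calls come from the official Python SDK; build_system_prompt, call_model, and parse_tool_call are hypothetical stubs standing in for a real LLM API and output parser, and notify_server.py is only a placeholder server command.

```python
# Sketch of the app-side loop from the diagram: fetch tool descriptions from an
# MCP server, ask the model, parse its output, and execute the chosen tool.
import asyncio
import json

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

def build_system_prompt(tools) -> str:
    # Hypothetical: render the tool list into a system prompt.
    return "You may call these tools: " + ", ".join(t.name for t in tools.tools) + "\n"

def call_model(prompt: str) -> str:
    # Hypothetical stub for a real LLM API call.
    return '{"tool": "send_email", "args": {"to": "dev@example.com", "subject": "tests", "body": "all green"}}'

def parse_tool_call(reply: str):
    # Hypothetical parser for the model's JSON tool-call output.
    data = json.loads(reply)
    return data["tool"], data["args"]

async def main(task: str) -> None:
    server = StdioServerParameters(command="python", args=["notify_server.py"])  # placeholder command
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()                    # app start: connect to the MCP server
            tools = await session.list_tools()            # fetch resources / tool descriptions
            prompt = build_system_prompt(tools) + task    # system prompt + task description
            reply = call_model(prompt)                    # ask the LLM
            name, args = parse_tool_call(reply)           # parse the AI output
            result = await session.call_tool(name, args)  # execute the task via MCP
            print(result)

asyncio.run(main("Send the unit-test results to the team"))
```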
Resources:
- Official server implementations: https://github.com/modelcontextprotocol/servers
- Cline's system prompt:
- https://glama.ai/mcp/clients
- https://glama.ai/mcp/servers
Introducing the Model Context Protocol
November 25, 2024
https://www.anthropic.com/news/model-context-protocol
Today, we're open-sourcing the Model Context Protocol (MCP), a new standard for connecting AI assistants to the systems where data lives, including content repositories, business tools, and development environments. Its aim is to help frontier models produce better, more relevant responses.
As AI assistants gain mainstream adoption, the industry has invested heavily in model capabilities, achieving rapid advances in reasoning and quality. Yet even the most sophisticated models are constrained by their isolation from data—trapped behind information silos and legacy systems. Every new data source requires its own custom implementation, making truly connected systems difficult to scale.
MCP addresses this challenge. It provides a universal, open standard for connecting AI systems with data sources, replacing fragmented integrations with a single protocol. The result is a simpler, more reliable way to give AI systems access to the data they need.
Model Context Protocol
The Model Context Protocol is an open standard that enables developers to build secure, two-way connections between their data sources and AI-powered tools. The architecture is straightforward: developers can either expose their data through MCP servers or build AI applications (MCP clients) that connect to these servers.
Today, we're introducing three major components of the Model Context Protocol for developers:
- The Model Context Protocol specification and SDKs
- Local MCP server support in the Claude Desktop apps
- An open-source repository of MCP servers
Claude 3.5 Sonnet is adept at quickly building MCP server implementations, making it easy for organizations and individuals to rapidly connect their most important datasets with a range of AI-powered tools. To help developers start exploring, we’re sharing pre-built MCP servers for popular enterprise systems like Google Drive, Slack, GitHub, Git, Postgres, and Puppeteer.
Early adopters like Block and Apollo have integrated MCP into their systems, while development tools companies including Zed, Replit, Codeium, and Sourcegraph are working with MCP to enhance their platforms—enabling AI agents to better retrieve relevant information to further understand the context around a coding task and produce more nuanced and functional code with fewer attempts.
"At Block, open source is more than a development model—it’s the foundation of our work and a commitment to creating technology that drives meaningful change and serves as a public good for all,” said Dhanji R. Prasanna, Chief Technology Officer at Block. “Open technologies like the Model Context Protocol are the bridges that connect AI to real-world applications, ensuring innovation is accessible, transparent, and rooted in collaboration. We are excited to partner on a protocol and use it to build agentic systems, which remove the burden of the mechanical so people can focus on the creative.”
Instead of maintaining separate connectors for each data source, developers can now build against a standard protocol. As the ecosystem matures, AI systems will maintain context as they move between different tools and datasets, replacing today's fragmented integrations with a more sustainable architecture.
Getting started
Developers can start building and testing MCP connectors today. All Claude.ai plans support connecting MCP servers to the Claude Desktop app.
Claude for Work customers can begin testing MCP servers locally, connecting Claude to internal systems and datasets. We'll soon provide developer toolkits for deploying remote production MCP servers that can serve your entire Claude for Work organization.
To start building:
- Install pre-built MCP servers through the Claude Desktop app
- Follow our quickstart guide to build your first MCP server
- Contribute to our open-source repositories of connectors and implementations
An open community
We’re committed to building MCP as a collaborative, open-source project and ecosystem, and we’re eager to hear your feedback. Whether you’re an AI tool developer, an enterprise looking to leverage existing data, or an early adopter exploring the frontier, we invite you to build the future of context-aware AI together.
Android Linux
2025-03-16

Android's Linux Terminal app is now widely available on Pixel devices; here's how to get it
You need a Pixel device, the latest software update, and a few minutes.
By Andy Walker
March 7, 2025
TL;DR
- Android's Linux Terminal app is now widely available on Pixel devices running the March 2025 update.
- The Debian-based environment lets users carry a full Linux instance with them, though it still lacks some conveniences.
Late last year we reported that Google was working on a native Linux Terminal app that would let smartphone users carry a desktop-class Linux distribution with them. Since then the app has rolled out with the Android 15 betas, and now, with the March 2025 Pixel update, a stable version is more broadly available to Google phone users running the latest stable Android.
After enabling the Linux development environment under Settings > System > Developer options, the Linux Terminal app icon appears automatically. When I turned the feature on and tapped the icon, I was prompted to download a 567 MB file.
Although my first attempt to launch the terminal on my Pixel 8 failed, the second attempt succeeded. Past that small hurdle, I could open the terminal from its shortcut in the app list and run basic commands such as help, df, and free -m. More advanced commands work too, of course.
Notably, the Linux environment is based on Debian, one of the most mature Linux distributions. Unlike Termux, a native terminal app, the Linux Terminal app runs inside a virtual machine via the Android Virtualization Framework (AVF).
The Linux Terminal app is still missing some features, though. The biggest gap is probably the lack of GUI app support, but as our earlier Doom demo showed, that capability is planned for Android 16.
For most users, a Linux terminal on Android may not be especially exciting or disruptive, but for developers and power users it is a big step forward: it makes it possible to run desktop-class Linux applications on a mobile device, which is handy for all sorts of needs.
Management
2025-02-10
Personal
2025-02-09
Generally speaking, every chapter of a technical book needs to be read carefully and every point understood, sometimes with hands-on practice to reinforce or deepen that understanding.
Management books are different. I need to read titles such as 《可复制的领导力》 (Replicable Leadership) and 《卓有成效的管理者》 (The Effective Executive), and then write reading notes.
Here is a summary of my reading method and how I write the notes.
- Get the e-book
- Read the table of contents and use the chapter titles to grasp the book's overall framework
- Have AI summarize the gist of the book
  Prompt example: Please summarize the core ideas, main viewpoints, structure, and intended readers of The Effective Executive.
  - AI is not all that reliable, so do this after my own quick read, and let the AI summary and my own understanding cross-check each other.
- Read other people's reviews, reading notes, and mind maps
- Skim the whole book quickly
  - Arguments versus stories
  - Read the key passages according to my own understanding
- Summarize the core keywords, search for them in the book, and jump straight to those passages to read
Cars
2025-01-31
Science Physics
2025-01-24
Before the Industrial Revolution, the world was dark: lighting was extremely expensive, and fire was the only source of artificial light.
Throughout history, artificial light was a privilege of the rich and powerful. Producing and maintaining it was laborious and dirty, and its availability and quality were poor. The poor could rarely obtain artificial light and mostly lived in darkness.
In ancient times, a house lit by candles at night was a mark of great wealth. Fine candles made of beeswax were the best light source of the day, but natural beeswax was in short supply and the candles were tedious to make by hand, so no one but the very richest could afford them.
One writer put it this way: "Open your refrigerator door and you summon more light than most households in the eighteenth century enjoyed in total."
Later, people found that whale oil, rendered from whale blubber, made a better candle fuel. It burned with a clean, steady light and was the best lighting of the early Industrial Revolution, but it was also very expensive.
The whaling industry lit the world, yet it also pushed some whale species to the brink of extinction. Between 1700 and 1800 alone, at least 300,000 whales were slaughtered for their oil.
In the early 1800s, gas lighting, which burned coal gas for light, appeared in Europe and the United States. But it was expensive to install and maintain, and dangerous, so gas lamps were generally used not in homes but for industry, commerce, and street lighting in big cities.
Gas lamps were bright, at least 20 times brighter than any earlier lamp. Gas lighting was humanity's first experience of bright illumination.
In 1846, kerosene, distilled from coal tar (a byproduct of gas production), began to be used as lamp fuel. Kerosene started to replace whale oil, sending the cost of lighting plummeting, and it burned bright and odorless.
Thanks to kerosene, night became bright for the first time; production and entertainment could continue after dark.
In the second half of the 19th century, Thomas Edison developed the practical incandescent electric lamp, and the era of electric lighting arrived.
DNS Golang
2025-01-18
- https://github.com/mr-karan/doggo
- https://doggo.mrkaran.dev/docs/
Installation
$ go install github.com/mr-karan/doggo/cmd/doggo@latest
$ doggo
NAME:
doggo 🐶 DNS Client for Humans
USAGE:
doggo [--] [query options] [arguments...]
VERSION:
unknown - unknown
EXAMPLES:
doggo mrkaran.dev Query a domain using defaults.
doggo mrkaran.dev CNAME Query for a CNAME record.
doggo mrkaran.dev MX @9.9.9.9 Uses a custom DNS resolver.
doggo -q mrkaran.dev -t MX -n 1.1.1.1 Using named arguments.
doggo mrkaran.dev --aa --ad Query with Authoritative Answer and Authenticated Data flags set.
doggo mrkaran.dev --cd --do Query with Checking Disabled and DNSSEC OK flags set.
doggo mrkaran.dev --gp-from Germany Query using Globalping API from a specific location.
FREE FORM ARGUMENTS:
Supply hostnames, query types, and classes without flags. Example:
doggo mrkaran.dev A @1.1.1.1
TRANSPORT OPTIONS:
Specify the protocol with a URL-type scheme.
UDP is used if no scheme is specified.
@udp:// eg: @1.1.1.1 initiates a UDP query to 1.1.1.1:53.
@tcp:// eg: @tcp://1.1.1.1 initiates a TCP query to 1.1.1.1:53.
@https:// eg: @https://cloudflare-dns.com/dns-query initiates a DOH query to Cloudflare via DoH.
@tls:// eg: @tls://1.1.1.1 initiates a DoT query to 1.1.1.1:853.
@sdns:// initiates a DNSCrypt or DoH query using a DNS stamp.
@quic:// initiates a DOQ query.
SUBCOMMANDS:
completions [bash|zsh|fish] Generate the shell completion script for the specified shell.
QUERY OPTIONS:
-q, --query=HOSTNAME Hostname to query the DNS records for (eg mrkaran.dev).
-t, --type=TYPE Type of the DNS Record (A, MX, NS etc).
-n, --nameserver=ADDR Address of a specific nameserver to send queries to (9.9.9.9, 8.8.8.8 etc).
-c, --class=CLASS Network class of the DNS record (IN, CH, HS etc).
-x, --reverse Performs a DNS Lookup for an IPv4 or IPv6 address. Sets the query type and class to PTR and IN respectively.
--any Query all supported DNS record types (A, AAAA, CNAME, MX, NS, PTR, SOA, SRV, TXT, CAA).
RESOLVER OPTIONS:
--strategy=STRATEGY Specify strategy to query nameserver listed in etc/resolv.conf. (all, random, first).
--ndots=INT Specify ndots parameter. Takes value from /etc/resolv.conf if using the system namesever or 1 otherwise.
--search Use the search list defined in resolv.conf. Defaults to true. Set --search=false to disable search list.
--timeout=DURATION Specify timeout for the resolver to return a response (e.g., 5s, 400ms, 1m).
-4, --ipv4 Use IPv4 only.
-6, --ipv6 Use IPv6 only.
--tls-hostname=HOSTNAME Provide a hostname for verification of the certificate if the provided DoT nameserver is an IP.
--skip-hostname-verification Skip TLS Hostname Verification in case of DOT Lookups.
QUERY FLAGS:
--aa Set Authoritative Answer flag.
--ad Set Authenticated Data flag.
--cd Set Checking Disabled flag.
--rd Set Recursion Desired flag (default: true).
--z Set Z flag (reserved for future use).
--do Set DNSSEC OK flag.
OUTPUT OPTIONS:
-J, --json Format the output as JSON.
--short Short output format. Shows only the response section.
--color Defaults to true. Set --color=false to disable colored output.
--debug Enable debug logging.
--time Shows how long the response took from the server.
GLOBALPING OPTIONS:
--gp-from=Germany Query using Globalping API from a specific location.
--gp-limit=INT Limit the number of probes to use from Globalping.
DNS queries
$ doggo sendcloud.net a @223.5.5.5
NAME TYPE CLASS TTL ADDRESS NAMESERVER
sendcloud.net. A IN 60s 106.75.106.173 223.5.5.5:53
sendcloud.net. A IN 60s 106.75.106.166 223.5.5.5:53
$ doggo sendcloud.net a @223.5.5.5 --json
{
"responses": [
{
"answers": [
{
"name": "sendcloud.net.",
"type": "A",
"class": "IN",
"ttl": "60s",
"address": "106.75.106.173",
"status": "",
"rtt": "67ms",
"nameserver": "223.5.5.5:53"
},
{
"name": "sendcloud.net.",
"type": "A",
"class": "IN",
"ttl": "60s",
"address": "106.75.106.166",
"status": "",
"rtt": "67ms",
"nameserver": "223.5.5.5:53"
}
],
"authorities": null,
"questions": [
{
"name": "sendcloud.net.",
"type": "A",
"class": "IN"
}
]
}
]
}
$ doggo sendcloud.net a @223.5.5.5 --json | jq ".responses[].answers[].address"
"106.75.106.166"
"106.75.106.173"
Reverse lookup
$ doggo --reverse 101.44.172.1 @223.5.5.5
NAME TYPE CLASS TTL ADDRESS NAMESERVER
1.172.44.101.in-addr.arpa. PTR IN 300s hwsg1c1.email.engagelab.com. 223.5.5.5:53
$ doggo hwsg1c1.email.engagelab.com. a @223.5.5.5
NAME TYPE CLASS TTL ADDRESS NAMESERVER
hwsg1c1.email.engagelab.com. A IN 600s 101.44.172.1 223.5.5.5:53
Globalping
$ doggo markjour.com --gp-from Germany,Japan --gp-limit 2
LOCATION NAME TYPE CLASS TTL ADDRESS NAMESERVER
Falkenstein, DE, EU, Hetzner Online GmbH (AS24940) markjour.com. A IN 600s 121.42.82.115 private
Osaka, JP, AS, Oracle Corporation (AS31898) markjour.com. A IN 600s 121.42.82.115 8.8.8.8
Tools Markdown
2025-01-05
https://github.com/microsoft/markitdown
A Python tool from Microsoft that converts Office documents and PDF files to Markdown.
markitdown path-to-file.pdf > document.md
markitdown path-to-file.pdf -o document.md
cat path-to-file.pdf | markitdown
from markitdown import MarkItDown

md = MarkItDown()
result = md.convert("test.xlsx")  # also handles docx, pptx, pdf, images, etc.
print(result.text_content)        # the converted Markdown text

from markitdown import MarkItDown
from openai import OpenAI

client = OpenAI()
md = MarkItDown(llm_client=client, llm_model="gpt-4o")  # let an LLM describe images
result = md.convert("example.jpg")
print(result.text_content)
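Building on the same API, here is a small sketch for converting a whole folder at once; the directory names and the extension list are placeholders.

```python
# Sketch: batch-convert every Office/PDF file in a folder to Markdown.
# "docs" and "markdown" are placeholder directory names.
from pathlib import Path
from markitdown import MarkItDown

md = MarkItDown()
src = Path("docs")
dst = Path("markdown")
dst.mkdir(exist_ok=True)

for path in src.iterdir():
    if path.suffix.lower() in {".pdf", ".docx", ".xlsx", ".pptx"}:
        result = md.convert(str(path))
        (dst / (path.stem + ".md")).write_text(result.text_content, encoding="utf-8")
```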