Implement comprehensive onboarding flow with white theme
- Replace dark theme with clean white design optimized for PDF reading
- Create integrated onboarding wizard with 5-step process:
* Scope selection (entire paper vs specific section with required details)
* Depth preference (gist, working understanding, reproduce)
* Learning style (concepts, mathematics, methods, figures)
* Chunking approach (AI-generated vs manual highlighting)
* Familiarity assessment with slider and optional context
- Remove redundant /process route and integrate document processing into homepage
- Fix React focus issues with memoized components and local state management
- Add academic background collection for personalized tutoring
- Implement responsive loading states and form validation
- Extract PDF highlight data to external JSON file for modularity
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
- backend/socraticai_onboarding.html +387 -0
- frontend/src/App.jsx +1 -10
- frontend/src/components/DocumentProcessor.jsx +62 -2317
- frontend/src/components/DocumentViewer.jsx +2 -3
- frontend/src/components/Homepage.jsx +180 -22
- frontend/src/components/OnboardingWizard.jsx +424 -0
- frontend/src/highlights.json +2260 -0
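The biggest line-count change above is mechanical: the inline `testPreloadedHighlights` object was moved out of DocumentProcessor.jsx into frontend/src/highlights.json. A minimal sketch of the resulting lookup, assuming the data keeps the shape of the removed object (chunk index mapping to an array of `{ id, position, content }` entries); `highlightsForChunk` is a hypothetical helper name, not something in this commit:

```javascript
// Sample data mirroring the removed inline object's shape. In the app this
// would come from `import highlights from './highlights.json'` instead.
const highlights = {
  0: [
    {
      id: "highlight_1755775800949",
      position: { boundingRect: { pageNumber: 2 }, rects: [] },
      content: { text: "Recurrent neural networks..." },
    },
  ],
};

// Hypothetical helper: preloaded highlights for one chunk, with an empty
// fallback so chunks without any data still render cleanly.
function highlightsForChunk(data, chunkIndex) {
  return data[chunkIndex] ?? [];
}

console.log(highlightsForChunk(highlights, 0).length); // 1
console.log(highlightsForChunk(highlights, 3).length); // 0
```

Keeping the component free of a 2,000-line literal also means the highlight data can be regenerated without touching React code.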
@@ -0,0 +1,387 @@
+<!doctype html>
+<html lang="en">
+  <head>
+    <meta charset="utf-8" />
+    <meta name="viewport" content="width=device-width, initial-scale=1" />
+    <title>SocraticAI – Onboarding Prototype</title>
+    <meta name="description" content="Clickable onboarding prototype for SocraticAI" />
+    <!-- Tailwind (CDN) -->
+    <script src="https://cdn.tailwindcss.com"></script>
+    <!-- React 18 + ReactDOM (UMD) -->
+    <script crossorigin src="https://unpkg.com/react@18/umd/react.production.min.js"></script>
+    <script crossorigin src="https://unpkg.com/react-dom@18/umd/react-dom.production.min.js"></script>
+    <!-- Babel for in-browser JSX transform (prototype only) -->
+    <script src="https://unpkg.com/@babel/standalone/babel.min.js"></script>
+    <style>
+      html, body { height: 100%; }
+      body { background: #0f172a; } /* slate-900 baseline while tailwind loads */
+    </style>
+  </head>
+  <body>
+    <div id="root"></div>
+
+    <script type="text/babel">
+      const { useEffect, useMemo, useRef, useState } = React;
+
+      // ---- Options ----
+      const options = {
+        goal: [
+          { key: "key_message", title: "Key Message", desc: "Grasp the main takeaways quickly." },
+          { key: "deep_dive", title: "Deep Dive", desc: "Work through the full paper with tutoring." },
+          { key: "methods", title: "Methods", desc: "Understand the experimental/modeling approach." },
+          { key: "assumptions", title: "Assumptions & Claims", desc: "Surface assumptions, limits, and key claims." },
+          { key: "figure_focus", title: "Figure Focus", desc: "Explain a specific figure step-by-step." },
+        ],
+        guidance: [
+          { key: "guided", title: "Guide me with a structured path", badge: "Recommended", desc: "We’ll sequence the key sections and keep you on track." },
+          { key: "manual", title: "I’ll choose what to discuss", desc: "You decide the order and topics as we go." },
+        ],
+        chunking: [
+          { key: "large", title: "Larger sections", desc: "Faster pace, fewer stops." },
+          { key: "medium", title: "Standard", desc: "Balanced pace and depth." },
+          { key: "small", title: "Short, focused chunks", desc: "Slower pace, more checkpoints." },
+        ],
+      };
+
+      const StepKeys = {
+        GOAL: "goal",
+        GUIDANCE: "guidance",
+        CHUNKING: "chunking",
+        FAMILIARITY: "familiarity",
+      };
+
+      function SocraticAIOnboarding() {
+        const [hasFile, setHasFile] = useState(false);
+        const [fileName, setFileName] = useState("");
+        const fileRef = useRef(null);
+        const [showAbout, setShowAbout] = useState(false);
+
+        const [currentStep, setCurrentStep] = useState(StepKeys.GOAL);
+        const [goal, setGoal] = useState(null);
+        const [guidance, setGuidance] = useState(null);
+        const [chunking, setChunking] = useState("medium");
+        const [familiarity, setFamiliarity] = useState(2);
+        const [showWizard, setShowWizard] = useState(false);
+        const [isLoading, setIsLoading] = useState(false);
+        const [loadingPhase, setLoadingPhase] = useState(0);
+
+        const stepNumber = useMemo(() => {
+          switch (currentStep) {
+            case StepKeys.GOAL: return 1;
+            case StepKeys.GUIDANCE: return 2;
+            case StepKeys.CHUNKING: return 3;
+            case StepKeys.FAMILIARITY: return 4;
+            default: return 1;
+          }
+        }, [currentStep]);
+
+        const totalSteps = 4;
+        const progressPct = (stepNumber / totalSteps) * 100;
+
+        useEffect(() => {
+          if (!isLoading) return;
+          const phases = [1200, 1200, 1400];
+          let i = 0;
+          const timer = setInterval(() => {
+            setLoadingPhase(p => p + 1);
+            i += 1;
+            if (i >= phases.length) clearInterval(timer);
+          }, phases[0]);
+          return () => clearInterval(timer);
+        }, [isLoading]);
+
+        function openFilePicker() {
+          if (fileRef.current) fileRef.current.click();
+        }
+
+        function fakeAssignFile(f) {
+          setHasFile(true);
+          setFileName(f?.name || "Demo_Paper.pdf");
+          setTimeout(() => setShowAbout(true), 250);
+        }
+
+        function onNext() {
+          if (currentStep === StepKeys.GOAL) return setCurrentStep(StepKeys.GUIDANCE);
+          if (currentStep === StepKeys.GUIDANCE) {
+            if (guidance === "manual") return setCurrentStep(StepKeys.FAMILIARITY);
+            return setCurrentStep(StepKeys.CHUNKING);
+          }
+          if (currentStep === StepKeys.CHUNKING) return setCurrentStep(StepKeys.FAMILIARITY);
+        }
+
+        function onBack() {
+          if (currentStep === StepKeys.GOAL) { setShowWizard(false); setShowAbout(true); return; }
+          if (currentStep === StepKeys.GUIDANCE) return setCurrentStep(StepKeys.GOAL);
+          if (currentStep === StepKeys.CHUNKING) return setCurrentStep(StepKeys.GUIDANCE);
+          if (currentStep === StepKeys.FAMILIARITY) return setCurrentStep(guidance === "guided" ? StepKeys.CHUNKING : StepKeys.GUIDANCE);
+        }
+
+        function startWizard() { setShowWizard(true); setCurrentStep(StepKeys.GOAL); }
+        function startLoading() { setIsLoading(true); setLoadingPhase(0); }
+
+        return (
+          <div className="min-h-screen w-full bg-gradient-to-br from-slate-950 via-slate-900 to-slate-800 text-slate-100 flex items-center justify-center p-6">
+            <div className="max-w-4xl w-full">
+              <div className="mb-6 flex items-center justify-between">
+                <div className="flex items-center gap-3">
+                  <div className="h-10 w-10 rounded-2xl bg-indigo-500/20 border border-indigo-400/30 flex items-center justify-center">
+                    <svg viewBox='0 0 24 24' className='h-6 w-6 text-indigo-300'><path d='M12 3l7.5 4.5v9L12 21 4.5 16.5v-9L12 3z' fill='currentColor'/></svg>
+                  </div>
+                  <div>
+                    <h1 className="text-2xl font-semibold tracking-tight">SocraticAI</h1>
+                    <p className="text-sm text-slate-400">Guided comprehension for complex papers</p>
+                  </div>
+                </div>
+                <div className="text-xs text-slate-400">Prototype · Clickable UI</div>
+              </div>
+
+              {!hasFile && !showAbout && !showWizard && !isLoading && (
+                <Landing openFilePicker={openFilePicker} fakeAssignFile={fakeAssignFile} fileRef={fileRef} />
+              )}
+
+              {hasFile && showAbout && !showWizard && !isLoading && (
+                <About fileName={fileName} onContinue={startWizard} openFilePicker={openFilePicker} />
+              )}
+
+              {showWizard && !isLoading && (
+                <Wizard
+                  currentStep={currentStep}
+                  onNext={onNext}
+                  onBack={onBack}
+                  goal={goal}
+                  setGoal={setGoal}
+                  guidance={guidance}
+                  setGuidance={setGuidance}
+                  chunking={chunking}
+                  setChunking={setChunking}
+                  familiarity={familiarity}
+                  setFamiliarity={setFamiliarity}
+                  stepNumber={stepNumber}
+                  totalSteps={totalSteps}
+                  progressPct={progressPct}
+                  onStart={startLoading}
+                />
+              )}
+
+              {isLoading && <Loading fileName={fileName} phase={loadingPhase} />}
+            </div>
+          </div>
+        );
+      }
+
+      function Landing({ openFilePicker, fakeAssignFile, fileRef }) {
+        return (
+          <div className="rounded-3xl border border-white/10 bg-white/5 shadow-2xl backdrop-blur p-8 md:p-12">
+            <div className="grid md:grid-cols-2 gap-8 items-center">
+              <div>
+                <h2 className="text-3xl md:text-4xl font-semibold leading-tight">Welcome to SocraticAI</h2>
+                <p className="mt-3 text-slate-300">
+                  Your mentor-like companion for mastering research papers. Upload a paper and we’ll turn it into a guided
+                  learning path — with questions, feedback, and visible progress.
+                </p>
+                <div className="mt-8 flex flex-wrap gap-3">
+                  <button onClick={openFilePicker} className="inline-flex items-center gap-2 rounded-xl px-5 py-3 bg-indigo-500 hover:bg-indigo-600 transition text-white font-medium shadow-lg shadow-indigo-900/30">
+                    <UploadIcon /> Upload a paper (PDF)
+                  </button>
+                  <input ref={fileRef} type="file" accept="application/pdf" className="hidden" onChange={(e) => fakeAssignFile(e.target.files?.[0])} />
+                  <button onClick={() => fakeAssignFile({ name: "Demo_Paper.pdf" })} className="inline-flex items-center gap-2 rounded-xl px-5 py-3 bg-white/10 hover:bg-white/20 transition text-slate-100 font-medium">
+                    Try demo paper
+                  </button>
+                </div>
+                <p className="mt-3 text-xs text-slate-400">No upload actually leaves your browser in this prototype.</p>
+              </div>
+              <div className="relative h-56 md:h-72 rounded-2xl bg-gradient-to-tr from-indigo-900/60 via-indigo-700/40 to-cyan-700/40 border border-white/10 overflow-hidden">
+                <div className="absolute inset-0 grid grid-cols-3 grid-rows-3 opacity-20">
+                  {Array.from({ length: 9 }).map((_, i) => <div key={i} className="border border-white/10" />)}
+                </div>
+                <div className="absolute inset-0 flex items-center justify-center">
+                  <div className="text-center">
+                    <div className="text-sm uppercase tracking-wide text-slate-300">Prototype Preview</div>
+                    <div className="mt-2 text-4xl font-semibold">Onboarding Flow</div>
+                  </div>
+                </div>
+              </div>
+            </div>
+          </div>
+        );
+      }
+
+      function About({ fileName, onContinue, openFilePicker }) {
+        return (
+          <div className="rounded-3xl border border-white/10 bg-white/5 shadow-2xl backdrop-blur p-8 md:p-12">
+            <div className="flex items-start gap-4">
+              <div className="h-10 w-10 rounded-xl bg-emerald-500/20 border border-emerald-400/30 flex items-center justify-center">
+                <CheckIcon />
+              </div>
+              <div className="flex-1">
+                <h3 className="text-2xl font-semibold">Paper added</h3>
+                <p className="text-slate-300 mt-1">{fileName}</p>
+                <p className="mt-4 text-slate-200">Here’s how SocraticAI helps you learn deeply:</p>
+                <ul className="mt-3 text-slate-300 space-y-2 list-disc list-inside">
+                  <li><span className="font-medium text-slate-200">Automatic inflection points.</span> We flag hypotheses, assumptions, method shifts, and key claims.</li>
+                  <li><span className="font-medium text-slate-200">Guided micro-conversations.</span> Short question/answer loops validate and extend your understanding.</li>
+                  <li><span className="font-medium text-slate-200">Visible progress.</span> Confidence builds as you complete focused checkpoints.</li>
+                </ul>
+                <div className="mt-6 flex flex-wrap gap-3">
+                  <button onClick={onContinue} className="rounded-xl px-5 py-3 bg-indigo-500 hover:bg-indigo-600 transition text-white font-medium shadow-lg shadow-indigo-900/30">Continue</button>
+                  <button onClick={openFilePicker} className="rounded-xl px-5 py-3 bg-white/10 hover:bg-white/20 transition text-slate-100 font-medium">Choose another paper</button>
+                </div>
+              </div>
+            </div>
+          </div>
+        );
+      }
+
+      function Wizard({
+        currentStep, onNext, onBack,
+        goal, setGoal, guidance, setGuidance,
+        chunking, setChunking, familiarity, setFamiliarity,
+        stepNumber, totalSteps, progressPct, onStart,
+      }) {
+        return (
+          <div className="rounded-3xl border border-white/10 bg-white/5 shadow-2xl backdrop-blur overflow-hidden">
+            <div className="h-2 bg-white/10">
+              <div className="h-2 bg-indigo-500" style={{ width: progressPct + '%' }} />
+            </div>
+
+            <div className="p-8 md:p-10">
+              <div className="flex items-center justify-between">
+                <div className="text-sm text-slate-400">Step {stepNumber} of {totalSteps}</div>
+                <div className="text-sm text-slate-400">Onboarding</div>
+              </div>
+
+              <div className="mt-6">
+                {currentStep === StepKeys.GOAL && <StepGoal goal={goal} setGoal={setGoal} />}
+                {currentStep === StepKeys.GUIDANCE && <StepGuidance guidance={guidance} setGuidance={setGuidance} />}
+                {currentStep === StepKeys.CHUNKING && <StepChunking chunking={chunking} setChunking={setChunking} />}
+                {currentStep === StepKeys.FAMILIARITY && <StepFamiliarity familiarity={familiarity} setFamiliarity={setFamiliarity} />}
+              </div>
+
+              <div className="mt-8 flex items-center justify-between">
+                <button onClick={onBack} className="rounded-xl px-4 py-2 bg-white/5 hover:bg-white/10 border border-white/10 text-slate-200">Back</button>
+
+                {currentStep !== StepKeys.FAMILIARITY && (
+                  <button
+                    onClick={onNext}
+                    disabled={(currentStep === StepKeys.GOAL && !goal) || (currentStep === StepKeys.GUIDANCE && !guidance)}
+                    className="rounded-xl px-5 py-2.5 bg-indigo-500 disabled:bg-indigo-900/40 hover:bg-indigo-600 transition text-white font-medium shadow-md"
+                  >Next</button>
+                )}
+
+                {currentStep === StepKeys.FAMILIARITY && (
+                  <button onClick={onStart} className="rounded-xl px-5 py-2.5 bg-emerald-500 hover:bg-emerald-600 transition text-white font-medium shadow-md">
+                    Let’s go — start
+                  </button>
+                )}
+              </div>
+            </div>
+          </div>
+        );
+      }
+
+      function StepGoal({ goal, setGoal }) {
+        return (
+          <div>
+            <h3 className="text-2xl font-semibold">What’s your goal for this paper?</h3>
+            <p className="mt-2 text-slate-300">Choose the outcome you care about. We’ll tailor the path.</p>
+            <div className="mt-6 grid sm:grid-cols-2 lg:grid-cols-3 gap-4">
+              {options.goal.map(o => (
+                <SelectableCard key={o.key} selected={goal === o.key} onClick={() => setGoal(o.key)} title={o.title} desc={o.desc} />
+              ))}
+            </div>
+          </div>
+        );
+      }
+
+      function StepGuidance({ guidance, setGuidance }) {
+        return (
+          <div>
+            <h3 className="text-2xl font-semibold">How should we structure your learning path?</h3>
+            <p className="mt-2 text-slate-300">Pick the level of guidance that fits how you like to study.</p>
+            <div className="mt-6 grid sm:grid-cols-2 gap-4">
+              {options.guidance.map(o => (
+                <SelectableCard key={o.key} selected={guidance === o.key} onClick={() => setGuidance(o.key)} title={o.title} desc={o.desc} badge={o.badge} />
+              ))}
+            </div>
+            {guidance === "manual" && (<p className="mt-4 text-sm text-slate-400">You’ll skip the chunk size step and go straight to familiarity.</p>)}
+          </div>
+        );
+      }
+
+      function StepChunking({ chunking, setChunking }) {
+        return (
+          <div>
+            <h3 className="text-2xl font-semibold">Preferred chunk size</h3>
+            <p className="mt-2 text-slate-300">We’ll slice the paper into checkpoints at the pace you choose.</p>
+            <div className="mt-6 grid sm:grid-cols-3 gap-4">
+              {options.chunking.map(o => (
+                <SelectableCard key={o.key} selected={chunking === o.key} onClick={() => setChunking(o.key)} title={o.title} desc={o.desc} />
+              ))}
+            </div>
+          </div>
+        );
+      }
+
+      function StepFamiliarity({ familiarity, setFamiliarity }) {
+        const labels = ["New to it", "Somewhat new", "Comfortable", "Very familiar", "I’ve taught this"];
+        return (
+          <div>
+            <h3 className="text-2xl font-semibold">How familiar are you with this topic?</h3>
+            <p className="mt-2 text-slate-300">This helps us pick the right starting point and vocabulary.</p>
+            <div className="mt-8">
+              <input type="range" min={0} max={4} step={1} value={familiarity} onChange={e => setFamiliarity(parseInt(e.target.value))} className="w-full accent-indigo-500" />
+              <div className="flex justify-between text-xs text-slate-400 mt-2">
+                {labels.map((lab, i) => (<div key={i} className={"w-20 " + (i === familiarity ? "text-slate-200" : "")}>{lab}</div>))}
+              </div>
+            </div>
+          </div>
+        );
+      }
+
+      function Loading({ fileName, phase }) {
+        const messages = ["Preparing your paper…", "Detecting inflection points…", "Setting up your guided path…", "All set. Launching the tutor…"];
+        const message = messages[Math.min(phase, messages.length - 1)];
+        return (
+          <div className="rounded-3xl border border-white/10 bg-white/5 shadow-2xl backdrop-blur p-10 md:p-16 text-center">
+            <div className="mx-auto h-16 w-16 rounded-full border-4 border-white/10 border-t-indigo-500 animate-spin" />
+            <h3 className="mt-6 text-2xl font-semibold">{message}</h3>
+            <p className="mt-2 text-slate-300">{fileName}</p>
+            <p className="mt-6 text-xs text-slate-400">(Prototype) This screen loops here — in the real app you’d enter the conversation next.</p>
+          </div>
+        );
+      }
+
+      function SelectableCard({ selected, onClick, title, desc, badge }) {
+        return (
+          <button onClick={onClick} className={"text-left rounded-2xl border transition p-4 hover:-translate-y-0.5 active:translate-y-0 bg-white/5 hover:bg-white/10 w-full " + (selected ? "border-indigo-400/60 ring-2 ring-indigo-400/50" : "border-white/10")}>
+            <div className="flex items-start justify-between">
+              <div className="text-base font-medium">{title}</div>
+              {badge && (<span className="text-[10px] uppercase tracking-wide bg-indigo-500/20 text-indigo-200 px-2 py-1 rounded-md border border-indigo-400/30">{badge}</span>)}
+            </div>
+            {desc && <p className="text-sm text-slate-300 mt-1">{desc}</p>}
+          </button>
+        );
+      }
+
+      function UploadIcon() {
+        return (
+          <svg viewBox="0 0 24 24" className="h-5 w-5" fill="none" stroke="currentColor" strokeWidth="1.8">
+            <path d="M12 16V4m0 0l-4 4m4-4l4 4"/>
+            <path d="M20 16v2a2 2 0 0 1-2 2H6a2 2 0 0 1-2-2v-2"/>
+          </svg>
+        );
+      }
+
+      function CheckIcon() {
+        return (
+          <svg viewBox="0 0 24 24" className="h-5 w-5 text-emerald-300" fill="none" stroke="currentColor" strokeWidth="2">
+            <path d="M20 6L9 17l-5-5" />
+          </svg>
+        );
+      }
+
+      const root = ReactDOM.createRoot(document.getElementById("root"));
+      root.render(<SocraticAIOnboarding />);
+    </script>
+  </body>
+</html>
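The onboarding prototype above branches its step order on the guidance choice: picking manual guidance skips the chunking step. That transition logic can be sketched as a pure function, mirroring the prototype's `onNext` (a sketch, not code from this commit):

```javascript
// Pure sketch of the prototype's onNext branching: choosing "manual"
// guidance skips the chunking step and jumps straight to familiarity.
function nextStep(current, guidance) {
  if (current === "goal") return "guidance";
  if (current === "guidance") return guidance === "manual" ? "familiarity" : "chunking";
  if (current === "chunking") return "familiarity";
  return current; // "familiarity" is the final step
}

console.log(nextStep("guidance", "manual")); // "familiarity"
console.log(nextStep("guidance", "guided")); // "chunking"
```

Keeping the branching in one place is what lets `onBack` invert it with a single `guidance === "guided"` check.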
@@ -1,16 +1,7 @@
-import { BrowserRouter as Router, Routes, Route } from 'react-router-dom';
 import Homepage from './components/Homepage';
-import DocumentProcessor from './components/DocumentProcessor';
 
 function App() {
-  return (
-    <Router>
-      <Routes>
-        <Route path="/" element={<Homepage />} />
-        <Route path="/process" element={<DocumentProcessor />} />
-      </Routes>
-    </Router>
-  );
+  return <Homepage />;
 }
 
 export default App;

@@ -12,16 +12,51 @@ import DocumentViewer from './DocumentViewer';
 import ChunkPanel from './ChunkPanel';
 import ProgressBar from './ProgressBar';
 import WelcomeScreen from './WelcomeScreen';
 
-
 // State for PDF navigation
 const [pdfNavigation, setPdfNavigation] = useState(null);
 // State for first LLM response loading
 const [waitingForFirstResponse, setWaitingForFirstResponse] = useState(false);
-// State for welcome screen visibility
-const [showWelcomeScreen, setShowWelcomeScreen] = useState(
 // State for document controls (like scrollToPage)
 const [documentControls, setDocumentControls] = useState(null);
 
 // Function to get the page number of the first chunk
 const getFirstChunkPage = () => {

@@ -49,17 +84,21 @@ function DocumentProcessor() {
   }, 2000);
   }
 };
-
-
-
-
-
-
-
-
-
-
-
 
 const {
   chunkStates,

@@ -92,2266 +131,7 @@ function DocumentProcessor() {
 
 // Add test preloaded highlights data - keyed by chunk index
 // Lennart version
-const testPreloadedHighlights = {
-  0: [{
-    "id": "highlight_1755775800949",
-    "position": {
-      "boundingRect": {
-        "x1": 144.01666259765625,
-        "y1": 130.43328857421875,
-        "x2": 673.6499633789062,
-        "y2": 328.26666259765625,
-        "width": 816,
-        "height": 1056,
-        "pageNumber": 2
-      },
-      "rects": [
-        {
-          "x1": 144.01666259765625,
-          "y1": 130.43328857421875,
-          "x2": 672.1999816894531,
-          "y2": 146.43328857421875,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 145,
-          "x2": 673.316650390625,
-          "y2": 161,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 159.4666748046875,
-          "x2": 672.1499786376953,
-          "y2": 175.4666748046875,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 174.04998779296875,
-          "x2": 673.4833374023438,
-          "y2": 190.04998779296875,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 188.61663818359375,
-          "x2": 280.8000030517578,
-          "y2": 204.61663818359375,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 210.48333740234375,
-          "x2": 673.2999877929688,
-          "y2": 226.48333740234375,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 225.04998779296875,
-          "x2": 673.25,
-          "y2": 241.04998779296875,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 239.51666259765625,
-          "x2": 421.50001525878906,
-          "y2": 255.51666259765625,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 443.81666564941406,
-          "y1": 239.51666259765625,
-          "x2": 672.6500091552734,
-          "y2": 255.51666259765625,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 184.1666717529297,
-          "y1": 245.08331298828125,
-          "x2": 187.2833251953125,
-          "y2": 256.28330993652344,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 422.18333435058594,
-          "y1": 245.08331298828125,
-          "x2": 441.9166717529297,
-          "y2": 256.28330993652344,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 254.08331298828125,
-          "x2": 673.6499633789062,
-          "y2": 270.08331298828125,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 268.66668701171875,
-          "x2": 673.3999633789062,
-          "y2": 284.66668701171875,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 283.23333740234375,
-          "x2": 672.1999969482422,
-          "y2": 299.23333740234375,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 297.70001220703125,
-          "x2": 673.1000213623047,
-          "y2": 313.70001220703125,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 312.26666259765625,
-          "x2": 441.433349609375,
-          "y2": 328.26666259765625,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        }
-      ]
-    },
-    "content": {
-      "text": "Recurrent neural networks, long short-term memory [13] and gated recurrent [7] neural networks\r transduction problems such as language modeling and machine translation [ 35 , 2 , 5]. Numerous\r efforts have since continued to push the boundaries of recurrent language models and encoder-decoder\r architectures [38, 24, 15].\r Recurrent models typically factor computation along the symbol positions of the input and output\r sequences. Aligning the positions to steps in computation time, they generate a sequence of hidden\r states ht, as a function of the previous hidden state ht−1 and the input for position t. This inherently\r sequential nature precludes parallelization within training examples, which becomes critical at longer\r sequence lengths, as memory constraints limit batching across examples. Recent work has achieved\r significant improvements in computational efficiency through factorization tricks [ 21 ] and conditional\r computation [ 32 ], while also improving model performance in case of the latter. The fundamental\r constraint of sequential computation, however, remains."
-    }
-  }, {
-    "id": "highlight_1755775878721",
-    "position": {
-      "boundingRect": {
-        "x1": 144.01666259765625,
-        "y1": 399.6000061035156,
-        "x2": 675.6499633789062,
-        "y2": 430.1833190917969,
-        "width": 816,
-        "height": 1056,
-        "pageNumber": 2
-      },
-      "rects": [
-        {
-          "x1": 144.01666259765625,
-          "y1": 399.6000061035156,
-          "x2": 673.2999877929688,
-          "y2": 415.6000061035156,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 414.1833190917969,
-          "x2": 675.6499633789062,
-          "y2": 430.1833190917969,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 2
-        }
-      ]
-    },
-    "content": {
-      "text": "In this work we propose the Transformer, a model architecture eschewing recurrence and instead\r relying entirely on an attention mechanism to draw global dependencies between input and output."
-    }
-  }],
-  1: [{
-    "id": "highlight_1755775928209",
-    "position": {
-      "boundingRect": {
-        "x1": 143.5333251953125,
-        "y1": 580.7166595458984,
-        "x2": 675.6000366210938,
-        "y2": 890.38330078125,
-        "width": 816,
-        "height": 1056,
-        "pageNumber": 3
-      },
-      "rects": [
-        {
-          "x1": 143.60000610351562,
-          "y1": 580.7166595458984,
-          "x2": 673.8500061035156,
-          "y2": 596.7166595458984,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 3
-        },
-        {
-          "x1": 144.01666259765625,
-          "y1": 595.2833251953125,
-          "x2": 674.9166870117188,
-          "y2": 611.2833251953125,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 3
-        },
-
{
|
| 327 |
-
"x1": 144.01666259765625,
|
| 328 |
-
"y1": 609.7499847412109,
|
| 329 |
-
"x2": 210.3000030517578,
|
| 330 |
-
"y2": 625.7499847412109,
|
| 331 |
-
"width": 816,
|
| 332 |
-
"height": 1056,
|
| 333 |
-
"pageNumber": 3
|
| 334 |
-
},
|
| 335 |
-
{
|
| 336 |
-
"x1": 144.01666259765625,
|
| 337 |
-
"y1": 642.3833160400391,
|
| 338 |
-
"x2": 163.5833282470703,
|
| 339 |
-
"y2": 658.3833160400391,
|
| 340 |
-
"width": 816,
|
| 341 |
-
"height": 1056,
|
| 342 |
-
"pageNumber": 3
|
| 343 |
-
},
|
| 344 |
-
{
|
| 345 |
-
"x1": 173.88333129882812,
|
| 346 |
-
"y1": 642.3833160400391,
|
| 347 |
-
"x2": 337.6999969482422,
|
| 348 |
-
"y2": 658.3833160400391,
|
| 349 |
-
"width": 816,
|
| 350 |
-
"height": 1056,
|
| 351 |
-
"pageNumber": 3
|
| 352 |
-
},
|
| 353 |
-
{
|
| 354 |
-
"x1": 144.01666259765625,
|
| 355 |
-
"y1": 669.0999908447266,
|
| 356 |
-
"x2": 200.46665954589844,
|
| 357 |
-
"y2": 685.0999908447266,
|
| 358 |
-
"width": 816,
|
| 359 |
-
"height": 1056,
|
| 360 |
-
"pageNumber": 3
|
| 361 |
-
},
|
| 362 |
-
{
|
| 363 |
-
"x1": 210.68333435058594,
|
| 364 |
-
"y1": 669.0999908447266,
|
| 365 |
-
"x2": 672.4833526611328,
|
| 366 |
-
"y2": 685.0999908447266,
|
| 367 |
-
"width": 816,
|
| 368 |
-
"height": 1056,
|
| 369 |
-
"pageNumber": 3
|
| 370 |
-
},
|
| 371 |
-
{
|
| 372 |
-
"x1": 144.01666259765625,
|
| 373 |
-
"y1": 683.566650390625,
|
| 374 |
-
"x2": 675.6000366210938,
|
| 375 |
-
"y2": 699.566650390625,
|
| 376 |
-
"width": 816,
|
| 377 |
-
"height": 1056,
|
| 378 |
-
"pageNumber": 3
|
| 379 |
-
},
|
| 380 |
-
{
|
| 381 |
-
"x1": 143.5333251953125,
|
| 382 |
-
"y1": 698.1333160400391,
|
| 383 |
-
"x2": 672.1999969482422,
|
| 384 |
-
"y2": 714.1333160400391,
|
| 385 |
-
"width": 816,
|
| 386 |
-
"height": 1056,
|
| 387 |
-
"pageNumber": 3
|
| 388 |
-
},
|
| 389 |
-
{
|
| 390 |
-
"x1": 144.01666259765625,
|
| 391 |
-
"y1": 712.7166748046875,
|
| 392 |
-
"x2": 672.5500030517578,
|
| 393 |
-
"y2": 728.7166748046875,
|
| 394 |
-
"width": 816,
|
| 395 |
-
"height": 1056,
|
| 396 |
-
"pageNumber": 3
|
| 397 |
-
},
|
| 398 |
-
{
|
| 399 |
-
"x1": 144.01666259765625,
|
| 400 |
-
"y1": 727.2833251953125,
|
| 401 |
-
"x2": 672.8166809082031,
|
| 402 |
-
"y2": 743.2833251953125,
|
| 403 |
-
"width": 816,
|
| 404 |
-
"height": 1056,
|
| 405 |
-
"pageNumber": 3
|
| 406 |
-
},
|
| 407 |
-
{
|
| 408 |
-
"x1": 144.01666259765625,
|
| 409 |
-
"y1": 741.75,
|
| 410 |
-
"x2": 673.3999633789062,
|
| 411 |
-
"y2": 757.75,
|
| 412 |
-
"width": 816,
|
| 413 |
-
"height": 1056,
|
| 414 |
-
"pageNumber": 3
|
| 415 |
-
},
|
| 416 |
-
{
|
| 417 |
-
"x1": 144.01666259765625,
|
| 418 |
-
"y1": 756.316650390625,
|
| 419 |
-
"x2": 350.9166717529297,
|
| 420 |
-
"y2": 772.316650390625,
|
| 421 |
-
"width": 816,
|
| 422 |
-
"height": 1056,
|
| 423 |
-
"pageNumber": 3
|
| 424 |
-
},
|
| 425 |
-
{
|
| 426 |
-
"x1": 378.4499969482422,
|
| 427 |
-
"y1": 756.316650390625,
|
| 428 |
-
"x2": 415.74998474121094,
|
| 429 |
-
"y2": 772.316650390625,
|
| 430 |
-
"width": 816,
|
| 431 |
-
"height": 1056,
|
| 432 |
-
"pageNumber": 3
|
| 433 |
-
},
|
| 434 |
-
{
|
| 435 |
-
"x1": 350.8666534423828,
|
| 436 |
-
"y1": 761.7833251953125,
|
| 437 |
-
"x2": 376.2333221435547,
|
| 438 |
-
"y2": 772.9833221435547,
|
| 439 |
-
"width": 816,
|
| 440 |
-
"height": 1056,
|
| 441 |
-
"pageNumber": 3
|
| 442 |
-
},
|
| 443 |
-
{
|
| 444 |
-
"x1": 144.01666259765625,
|
| 445 |
-
"y1": 787.1499938964844,
|
| 446 |
-
"x2": 197.60000610351562,
|
| 447 |
-
"y2": 803.1499938964844,
|
| 448 |
-
"width": 816,
|
| 449 |
-
"height": 1056,
|
| 450 |
-
"pageNumber": 3
|
| 451 |
-
},
|
| 452 |
-
{
|
| 453 |
-
"x1": 207.9166717529297,
|
| 454 |
-
"y1": 787.1499938964844,
|
| 455 |
-
"x2": 672.4166870117188,
|
| 456 |
-
"y2": 803.1499938964844,
|
| 457 |
-
"width": 816,
|
| 458 |
-
"height": 1056,
|
| 459 |
-
"pageNumber": 3
|
| 460 |
-
},
|
| 461 |
-
{
|
| 462 |
-
"x1": 144.01666259765625,
|
| 463 |
-
"y1": 801.7333374023438,
|
| 464 |
-
"x2": 673.3666381835938,
|
| 465 |
-
"y2": 817.7333374023438,
|
| 466 |
-
"width": 816,
|
| 467 |
-
"height": 1056,
|
| 468 |
-
"pageNumber": 3
|
| 469 |
-
},
|
| 470 |
-
{
|
| 471 |
-
"x1": 144.01666259765625,
|
| 472 |
-
"y1": 816.1999816894531,
|
| 473 |
-
"x2": 673.2666625976562,
|
| 474 |
-
"y2": 832.1999816894531,
|
| 475 |
-
"width": 816,
|
| 476 |
-
"height": 1056,
|
| 477 |
-
"pageNumber": 3
|
| 478 |
-
},
|
| 479 |
-
{
|
| 480 |
-
"x1": 144.01666259765625,
|
| 481 |
-
"y1": 830.7666625976562,
|
| 482 |
-
"x2": 673.3833618164062,
|
| 483 |
-
"y2": 846.7666625976562,
|
| 484 |
-
"width": 816,
|
| 485 |
-
"height": 1056,
|
| 486 |
-
"pageNumber": 3
|
| 487 |
-
},
|
| 488 |
-
{
|
| 489 |
-
"x1": 144.01666259765625,
|
| 490 |
-
"y1": 845.3499755859375,
|
| 491 |
-
"x2": 673.2833251953125,
|
| 492 |
-
"y2": 861.3499755859375,
|
| 493 |
-
"width": 816,
|
| 494 |
-
"height": 1056,
|
| 495 |
-
"pageNumber": 3
|
| 496 |
-
},
|
| 497 |
-
{
|
| 498 |
-
"x1": 144.01666259765625,
|
| 499 |
-
"y1": 859.9166564941406,
|
| 500 |
-
"x2": 673.3666381835938,
|
| 501 |
-
"y2": 875.9166564941406,
|
| 502 |
-
"width": 816,
|
| 503 |
-
"height": 1056,
|
| 504 |
-
"pageNumber": 3
|
| 505 |
-
},
|
| 506 |
-
{
|
| 507 |
-
"x1": 144.01666259765625,
|
| 508 |
-
"y1": 874.38330078125,
|
| 509 |
-
"x2": 608.8999938964844,
|
| 510 |
-
"y2": 890.38330078125,
|
| 511 |
-
"width": 816,
|
| 512 |
-
"height": 1056,
|
| 513 |
-
"pageNumber": 3
|
| 514 |
-
}
|
| 515 |
-
]
|
| 516 |
-
},
|
| 517 |
-
"content": {
|
| 518 |
-
"text": "The Transformer follows this overall architecture using stacked self-attention and point-wise, fully\r connected layers for both the encoder and decoder, shown in the left and right halves of Figure 1,\r respectively.\r 3.1 Encoder and Decoder Stacks\r Encoder: The encoder is composed of a stack of N = 6 identical layers. Each layer has two\r sub-layers. The first is a multi-head self-attention mechanism, and the second is a simple, position-\r wise fully connected feed-forward network. We employ a residual connection [ 11 ] around each of\r the two sub-layers, followed by layer normalization [1]. That is, the output of each sub-layer is\r LayerNorm(x + Sublayer(x)), where Sublayer(x) is the function implemented by the sub-layer\r itself. To facilitate these residual connections, all sub-layers in the model, as well as the embedding\r layers, produce outputs of dimension dmodel = 512.\r Decoder: The decoder is also composed of a stack of N = 6 identical layers. In addition to the two\r sub-layers in each encoder layer, the decoder inserts a third sub-layer, which performs multi-head\r attention over the output of the encoder stack. Similar to the encoder, we employ residual connections\r around each of the sub-layers, followed by layer normalization. We also modify the self-attention\r sub-layer in the decoder stack to prevent positions from attending to subsequent positions. This\r masking, combined with fact that the output embeddings are offset by one position, ensures that the\r predictions for position i can depend only on the known outputs at positions less than i."
|
| 519 |
-
}
|
| 520 |
-
}],
|
| 521 |
-
2: [{
|
| 522 |
-
"id": "highlight_1755777477064",
|
| 523 |
-
"position": {
|
| 524 |
-
"boundingRect": {
|
| 525 |
-
"x1": 143.53334045410156,
|
| 526 |
-
"y1": 933.2333068847656,
|
| 527 |
-
"x2": 674.8833160400391,
|
| 528 |
-
"y2": 964.6999816894531,
|
| 529 |
-
"width": 816,
|
| 530 |
-
"height": 1056,
|
| 531 |
-
"pageNumber": 3
|
| 532 |
-
},
|
| 533 |
-
"rects": [
|
| 534 |
-
{
|
| 535 |
-
"x1": 143.53334045410156,
|
| 536 |
-
"y1": 933.2333068847656,
|
| 537 |
-
"x2": 674.8833160400391,
|
| 538 |
-
"y2": 950.2333068847656,
|
| 539 |
-
"width": 816,
|
| 540 |
-
"height": 1056,
|
| 541 |
-
"pageNumber": 3
|
| 542 |
-
},
|
| 543 |
-
{
|
| 544 |
-
"x1": 143.53334045410156,
|
| 545 |
-
"y1": 947.6999816894531,
|
| 546 |
-
"x2": 673.2666778564453,
|
| 547 |
-
"y2": 964.6999816894531,
|
| 548 |
-
"width": 816,
|
| 549 |
-
"height": 1056,
|
| 550 |
-
"pageNumber": 3
|
| 551 |
-
}
|
| 552 |
-
]
|
| 553 |
-
},
|
| 554 |
-
"content": {
|
| 555 |
-
"text": "An attention function can be described as mapping a query and a set of key-value pairs to an output,\r where the query, keys, values, and output are all vectors. The output is computed as a weighted sum"
|
| 556 |
-
}
|
| 557 |
-
}, {
|
| 558 |
-
"id": "highlight_1755777635270",
|
| 559 |
-
"position": {
|
| 560 |
-
"boundingRect": {
|
| 561 |
-
"x1": 143.36666870117188,
|
| 562 |
-
"y1": 420.75,
|
| 563 |
-
"x2": 674.5166625976562,
|
| 564 |
-
"y2": 825.8000183105469,
|
| 565 |
-
"width": 816,
|
| 566 |
-
"height": 1056,
|
| 567 |
-
"pageNumber": 4
|
| 568 |
-
},
|
| 569 |
-
"rects": [
|
| 570 |
-
{
|
| 571 |
-
"x1": 144.0166778564453,
|
| 572 |
-
"y1": 420.75,
|
| 573 |
-
"x2": 673.2666778564453,
|
| 574 |
-
"y2": 437.75,
|
| 575 |
-
"width": 816,
|
| 576 |
-
"height": 1056,
|
| 577 |
-
"pageNumber": 4
|
| 578 |
-
},
|
| 579 |
-
{
|
| 580 |
-
"x1": 144.0166778564453,
|
| 581 |
-
"y1": 435.2166748046875,
|
| 582 |
-
"x2": 325.1333312988281,
|
| 583 |
-
"y2": 452.2166748046875,
|
| 584 |
-
"width": 816,
|
| 585 |
-
"height": 1056,
|
| 586 |
-
"pageNumber": 4
|
| 587 |
-
},
|
| 588 |
-
{
|
| 589 |
-
"x1": 144.0166778564453,
|
| 590 |
-
"y1": 465.95001220703125,
|
| 591 |
-
"x2": 173.53334045410156,
|
| 592 |
-
"y2": 482.95001220703125,
|
| 593 |
-
"width": 816,
|
| 594 |
-
"height": 1056,
|
| 595 |
-
"pageNumber": 4
|
| 596 |
-
},
|
| 597 |
-
{
|
| 598 |
-
"x1": 183.83334350585938,
|
| 599 |
-
"y1": 465.95001220703125,
|
| 600 |
-
"x2": 352.2166748046875,
|
| 601 |
-
"y2": 482.95001220703125,
|
| 602 |
-
"width": 816,
|
| 603 |
-
"height": 1056,
|
| 604 |
-
"pageNumber": 4
|
| 605 |
-
},
|
| 606 |
-
{
|
| 607 |
-
"x1": 143.36666870117188,
|
| 608 |
-
"y1": 490.8666687011719,
|
| 609 |
-
"x2": 673.2166442871094,
|
| 610 |
-
"y2": 507.8666687011719,
|
| 611 |
-
"width": 816,
|
| 612 |
-
"height": 1056,
|
| 613 |
-
"pageNumber": 4
|
| 614 |
-
},
|
| 615 |
-
{
|
| 616 |
-
"x1": 144.0166778564453,
|
| 617 |
-
"y1": 505.3333435058594,
|
| 618 |
-
"x2": 672.4166870117188,
|
| 619 |
-
"y2": 522.3333435058594,
|
| 620 |
-
"width": 816,
|
| 621 |
-
"height": 1056,
|
| 622 |
-
"pageNumber": 4
|
| 623 |
-
},
|
| 624 |
-
{
|
| 625 |
-
"x1": 333.41668701171875,
|
| 626 |
-
"y1": 511.4166717529297,
|
| 627 |
-
"x2": 340.0333557128906,
|
| 628 |
-
"y2": 524.7000122070312,
|
| 629 |
-
"width": 816,
|
| 630 |
-
"height": 1056,
|
| 631 |
-
"pageNumber": 4
|
| 632 |
-
},
|
| 633 |
-
{
|
| 634 |
-
"x1": 315.7833251953125,
|
| 635 |
-
"y1": 511.5166778564453,
|
| 636 |
-
"x2": 320.01666259765625,
|
| 637 |
-
"y2": 522.5166778564453,
|
| 638 |
-
"width": 816,
|
| 639 |
-
"height": 1056,
|
| 640 |
-
"pageNumber": 4
|
| 641 |
-
},
|
| 642 |
-
{
|
| 643 |
-
"x1": 468.61663818359375,
|
| 644 |
-
"y1": 511.5166778564453,
|
| 645 |
-
"x2": 476.0166931152344,
|
| 646 |
-
"y2": 522.5166778564453,
|
| 647 |
-
"width": 816,
|
| 648 |
-
"height": 1056,
|
| 649 |
-
"pageNumber": 4
|
| 650 |
-
},
|
| 651 |
-
{
|
| 652 |
-
"x1": 144.0166778564453,
|
| 653 |
-
"y1": 519.9166717529297,
|
| 654 |
-
"x2": 333.066650390625,
|
| 655 |
-
"y2": 536.9166717529297,
|
| 656 |
-
"width": 816,
|
| 657 |
-
"height": 1056,
|
| 658 |
-
"pageNumber": 4
|
| 659 |
-
},
|
| 660 |
-
{
|
| 661 |
-
"x1": 344.433349609375,
|
| 662 |
-
"y1": 519.9166717529297,
|
| 663 |
-
"x2": 672.7166748046875,
|
| 664 |
-
"y2": 536.9166717529297,
|
| 665 |
-
"width": 816,
|
| 666 |
-
"height": 1056,
|
| 667 |
-
"pageNumber": 4
|
| 668 |
-
},
|
| 669 |
-
{
|
| 670 |
-
"x1": 351.3666687011719,
|
| 671 |
-
"y1": 526.0833435058594,
|
| 672 |
-
"x2": 355.6000061035156,
|
| 673 |
-
"y2": 537.0833435058594,
|
| 674 |
-
"width": 816,
|
| 675 |
-
"height": 1056,
|
| 676 |
-
"pageNumber": 4
|
| 677 |
-
},
|
| 678 |
-
{
|
| 679 |
-
"x1": 143.68333435058594,
|
| 680 |
-
"y1": 534.4833374023438,
|
| 681 |
-
"x2": 180.7166748046875,
|
| 682 |
-
"y2": 551.4833374023438,
|
| 683 |
-
"width": 816,
|
| 684 |
-
"height": 1056,
|
| 685 |
-
"pageNumber": 4
|
| 686 |
-
},
|
| 687 |
-
{
|
| 688 |
-
"x1": 144.0166778564453,
|
| 689 |
-
"y1": 556.3500061035156,
|
| 690 |
-
"x2": 673.5166778564453,
|
| 691 |
-
"y2": 573.3500061035156,
|
| 692 |
-
"width": 816,
|
| 693 |
-
"height": 1056,
|
| 694 |
-
"pageNumber": 4
|
| 695 |
-
},
|
| 696 |
-
{
|
| 697 |
-
"x1": 144.0166778564453,
|
| 698 |
-
"y1": 570.9166564941406,
|
| 699 |
-
"x2": 672.0999755859375,
|
| 700 |
-
"y2": 587.9166564941406,
|
| 701 |
-
"width": 816,
|
| 702 |
-
"height": 1056,
|
| 703 |
-
"pageNumber": 4
|
| 704 |
-
},
|
| 705 |
-
{
|
| 706 |
-
"x1": 144.0166778564453,
|
| 707 |
-
"y1": 585.3833312988281,
|
| 708 |
-
"x2": 273.48333740234375,
|
| 709 |
-
"y2": 602.3833312988281,
|
| 710 |
-
"width": 816,
|
| 711 |
-
"height": 1056,
|
| 712 |
-
"pageNumber": 4
|
| 713 |
-
},
|
| 714 |
-
{
|
| 715 |
-
"x1": 496.85003662109375,
|
| 716 |
-
"y1": 619.1166687011719,
|
| 717 |
-
"x2": 501.38336181640625,
|
| 718 |
-
"y2": 630.1166687011719,
|
| 719 |
-
"width": 816,
|
| 720 |
-
"height": 1056,
|
| 721 |
-
"pageNumber": 4
|
| 722 |
-
},
|
| 723 |
-
{
|
| 724 |
-
"x1": 474.16668701171875,
|
| 725 |
-
"y1": 619.8166809082031,
|
| 726 |
-
"x2": 496.00001525878906,
|
| 727 |
-
"y2": 636.8166809082031,
|
| 728 |
-
"width": 816,
|
| 729 |
-
"height": 1056,
|
| 730 |
-
"pageNumber": 4
|
| 731 |
-
},
|
| 732 |
-
{
|
| 733 |
-
"x1": 293.26666259765625,
|
| 734 |
-
"y1": 628.7833557128906,
|
| 735 |
-
"x2": 484.0500183105469,
|
| 736 |
-
"y2": 645.7833557128906,
|
| 737 |
-
"width": 816,
|
| 738 |
-
"height": 1056,
|
| 739 |
-
"pageNumber": 4
|
| 740 |
-
},
|
| 741 |
-
{
|
| 742 |
-
"x1": 506.816650390625,
|
| 743 |
-
"y1": 628.7833557128906,
|
| 744 |
-
"x2": 522.7833251953125,
|
| 745 |
-
"y2": 645.7833557128906,
|
| 746 |
-
"width": 816,
|
| 747 |
-
"height": 1056,
|
| 748 |
-
"pageNumber": 4
|
| 749 |
-
},
|
| 750 |
-
{
|
| 751 |
-
"x1": 657.3666381835938,
|
| 752 |
-
"y1": 628.7833557128906,
|
| 753 |
-
"x2": 672.8999786376953,
|
| 754 |
-
"y2": 645.7833557128906,
|
| 755 |
-
"width": 816,
|
| 756 |
-
"height": 1056,
|
| 757 |
-
"pageNumber": 4
|
| 758 |
-
},
|
| 759 |
-
{
|
| 760 |
-
"x1": 488.5333251953125,
|
| 761 |
-
"y1": 638.7166748046875,
|
| 762 |
-
"x2": 495.51666259765625,
|
| 763 |
-
"y2": 655.7166748046875,
|
| 764 |
-
"width": 816,
|
| 765 |
-
"height": 1056,
|
| 766 |
-
"pageNumber": 4
|
| 767 |
-
},
|
| 768 |
-
{
|
| 769 |
-
"x1": 495.38336181640625,
|
| 770 |
-
"y1": 644.7833557128906,
|
| 771 |
-
"x2": 499.61669921875,
|
| 772 |
-
"y2": 655.7833557128906,
|
| 773 |
-
"width": 816,
|
| 774 |
-
"height": 1056,
|
| 775 |
-
"pageNumber": 4
|
| 776 |
-
},
|
| 777 |
-
{
|
| 778 |
-
"x1": 143.60000610351562,
|
| 779 |
-
"y1": 663.8500061035156,
|
| 780 |
-
"x2": 674.5166625976562,
|
| 781 |
-
"y2": 680.8500061035156,
|
| 782 |
-
"width": 816,
|
| 783 |
-
"height": 1056,
|
| 784 |
-
"pageNumber": 4
|
| 785 |
-
},
|
| 786 |
-
{
|
| 787 |
-
"x1": 144.0166778564453,
|
| 788 |
-
"y1": 678.4166564941406,
|
| 789 |
-
"x2": 673.5166778564453,
|
| 790 |
-
"y2": 695.4166564941406,
|
| 791 |
-
"width": 816,
|
| 792 |
-
"height": 1056,
|
| 793 |
-
"pageNumber": 4
|
| 794 |
-
},
|
| 795 |
-
{
|
| 796 |
-
"x1": 144.0166778564453,
|
| 797 |
-
"y1": 692.8833312988281,
|
| 798 |
-
"x2": 158.1999969482422,
|
| 799 |
-
"y2": 709.8833312988281,
|
| 800 |
-
"width": 816,
|
| 801 |
-
"height": 1056,
|
| 802 |
-
"pageNumber": 4
|
| 803 |
-
},
|
| 804 |
-
{
|
| 805 |
-
"x1": 181.7166748046875,
|
| 806 |
-
"y1": 692.8833312988281,
|
| 807 |
-
"x2": 673.1666870117188,
|
| 808 |
-
"y2": 709.8833312988281,
|
| 809 |
-
"width": 816,
|
| 810 |
-
"height": 1056,
|
| 811 |
-
"pageNumber": 4
|
| 812 |
-
},
|
| 813 |
-
{
|
| 814 |
-
"x1": 160.08334350585938,
|
| 815 |
-
"y1": 695.4666748046875,
|
| 816 |
-
"x2": 673.1666870117188,
|
| 817 |
-
"y2": 706.4666748046875,
|
| 818 |
-
"width": 816,
|
| 819 |
-
"height": 1056,
|
| 820 |
-
"pageNumber": 4
|
| 821 |
-
},
|
| 822 |
-
{
|
| 823 |
-
"x1": 168.81666564941406,
|
| 824 |
-
"y1": 702.6499938964844,
|
| 825 |
-
"x2": 177.3000030517578,
|
| 826 |
-
"y2": 713.6499938964844,
|
| 827 |
-
"width": 816,
|
| 828 |
-
"height": 1056,
|
| 829 |
-
"pageNumber": 4
|
| 830 |
-
},
|
| 831 |
-
{
|
| 832 |
-
"x1": 144.0166778564453,
|
| 833 |
-
"y1": 710.5166931152344,
|
| 834 |
-
"x2": 673.3499908447266,
|
| 835 |
-
"y2": 727.5166931152344,
|
| 836 |
-
"width": 816,
|
| 837 |
-
"height": 1056,
|
| 838 |
-
"pageNumber": 4
|
| 839 |
-
},
|
| 840 |
-
{
|
| 841 |
-
"x1": 144.0166778564453,
|
| 842 |
-
"y1": 724.9833374023438,
|
| 843 |
-
"x2": 673.3833160400391,
|
| 844 |
-
"y2": 741.9833374023438,
|
| 845 |
-
"width": 816,
|
| 846 |
-
"height": 1056,
|
| 847 |
-
"pageNumber": 4
|
| 848 |
-
},
|
| 849 |
-
{
|
| 850 |
-
"x1": 144.0166778564453,
|
| 851 |
-
"y1": 739.5666809082031,
|
| 852 |
-
"x2": 288.65000915527344,
|
| 853 |
-
"y2": 756.5666809082031,
|
| 854 |
-
"width": 816,
|
| 855 |
-
"height": 1056,
|
| 856 |
-
"pageNumber": 4
|
| 857 |
-
},
|
| 858 |
-
{
|
| 859 |
-
"x1": 143.36666870117188,
|
| 860 |
-
"y1": 761.4166564941406,
|
| 861 |
-
"x2": 672.9666748046875,
|
| 862 |
-
"y2": 778.4166564941406,
|
| 863 |
-
"width": 816,
|
| 864 |
-
"height": 1056,
|
| 865 |
-
"pageNumber": 4
|
| 866 |
-
},
|
| 867 |
-
{
|
| 868 |
-
"x1": 289.01666259765625,
|
| 869 |
-
"y1": 767.4833374023438,
|
| 870 |
-
"x2": 296.75001525878906,
|
| 871 |
-
"y2": 778.4833374023438,
|
| 872 |
-
"width": 816,
|
| 873 |
-
"height": 1056,
|
| 874 |
-
"pageNumber": 4
|
| 875 |
-
},
|
| 876 |
-
{
|
| 877 |
-
"x1": 144.0166778564453,
|
| 878 |
-
"y1": 775.9833374023438,
|
| 879 |
-
"x2": 672.4499664306641,
|
| 880 |
-
"y2": 792.9833374023438,
|
| 881 |
-
"width": 816,
|
| 882 |
-
"height": 1056,
|
| 883 |
-
"pageNumber": 4
|
| 884 |
-
},
|
| 885 |
-
{
|
| 886 |
-
"x1": 455.4000244140625,
|
| 887 |
-
"y1": 782.0666809082031,
|
| 888 |
-
"x2": 463.13331604003906,
|
| 889 |
-
"y2": 793.0666809082031,
|
| 890 |
-
"width": 816,
|
| 891 |
-
"height": 1056,
|
| 892 |
-
"pageNumber": 4
|
| 893 |
-
},
|
| 894 |
-
{
|
| 895 |
-
"x1": 144.0166778564453,
|
| 896 |
-
"y1": 790.5666809082031,
|
| 897 |
-
"x2": 673.3500366210938,
|
| 898 |
-
"y2": 807.5666809082031,
|
| 899 |
-
"width": 816,
|
| 900 |
-
"height": 1056,
|
| 901 |
-
"pageNumber": 4
|
| 902 |
-
},
|
| 903 |
-
{
|
| 904 |
-
"x1": 150.86666870117188,
|
| 905 |
-
"y1": 796.6333312988281,
|
| 906 |
-
"x2": 155.10000610351562,
|
| 907 |
-
"y2": 807.6333312988281,
|
| 908 |
-
"width": 816,
|
| 909 |
-
"height": 1056,
|
| 910 |
-
"pageNumber": 4
|
| 911 |
-
},
|
| 912 |
-
{
|
| 913 |
-
"x1": 595.0166625976562,
|
| 914 |
-
"y1": 803.9166564941406,
|
| 915 |
-
"x2": 599.7333221435547,
|
| 916 |
-
"y2": 814.9166564941406,
|
| 917 |
-
"width": 816,
|
| 918 |
-
"height": 1056,
|
| 919 |
-
"pageNumber": 4
|
| 920 |
-
},
|
| 921 |
-
{
|
| 922 |
-
"x1": 144.0166778564453,
|
| 923 |
-
"y1": 805.0333557128906,
|
| 924 |
-
"x2": 592.2333679199219,
|
| 925 |
-
"y2": 822.0333557128906,
|
| 926 |
-
"width": 816,
|
| 927 |
-
"height": 1056,
|
| 928 |
-
"pageNumber": 4
|
| 929 |
-
},
|
| 930 |
-
{
|
| 931 |
-
"x1": 596.316650390625,
|
| 932 |
-
"y1": 814.8000183105469,
|
| 933 |
-
"x2": 604.88330078125,
|
| 934 |
-
"y2": 825.8000183105469,
|
| 935 |
-
"width": 816,
|
| 936 |
-
"height": 1056,
|
| 937 |
-
"pageNumber": 4
|
| 938 |
-
}
|
| 939 |
-
]
|
| 940 |
-
},
|
| 941 |
-
"content": {
|
| 942 |
-
"text": "of the values, where the weight assigned to each value is computed by a compatibility function of the\r query with the corresponding key.\r 3.2.1 Scaled Dot-Product Attention\r We call our particular attention \"Scaled Dot-Product Attention\" (Figure 2). The input consists of\r queries and keys of dimension dk, and values of dimension dv . We compute the dot products of the\r query with all keys, divide each by √dk, and apply a softmax function to obtain the weights on the\r values.\r In practice, we compute the attention function on a set of queries simultaneously, packed together\r into a matrix Q. The keys and values are also packed together into matrices K and V . We compute\r the matrix of outputs as:\r Attention(Q, K, V ) = softmax( QKT\r √dk\r )V (1)\r The two most commonly used attention functions are additive attention [ 2], and dot-product (multi-\r plicative) attention. Dot-product attention is identical to our algorithm, except for the scaling factor\r of 1√dk\r . Additive attention computes the compatibility function using a feed-forward network with\r a single hidden layer. While the two are similar in theoretical complexity, dot-product attention is\r much faster and more space-efficient in practice, since it can be implemented using highly optimized\r matrix multiplication code.\r While for small values of dk the two mechanisms perform similarly, additive attention outperforms\r dot product attention without scaling for larger values of dk [3 ]. We suspect that for large values of\r dk, the dot products grow large in magnitude, pushing the softmax function into regions where it has\r extremely small gradients 4. To counteract this effect, we scale the dot products by 1√dk\r "
|
| 943 |
-
}
|
| 944 |
-
}, {
|
| 945 |
-
"id": "highlight_1755777652333",
|
| 946 |
-
"position": {
|
| 947 |
-
"boundingRect": {
|
| 948 |
-
"x1": 143.68333435058594,
|
| 949 |
-
"y1": 932.4833068847656,
|
| 950 |
-
"x2": 671.88330078125,
|
| 951 |
-
"y2": 966.3499755859375,
|
| 952 |
-
"width": 816,
|
| 953 |
-
"height": 1056,
|
| 954 |
-
"pageNumber": 4
|
| 955 |
-
},
|
| 956 |
-
"rects": [
|
| 957 |
-
{
|
| 958 |
-
"x1": 160.83334350585938,
|
| 959 |
-
"y1": 932.4833068847656,
|
| 960 |
-
"x2": 671.7000274658203,
|
| 961 |
-
"y2": 946.4833068847656,
|
| 962 |
-
"width": 816,
|
| 963 |
-
"height": 1056,
|
| 964 |
-
"pageNumber": 4
|
| 965 |
-
},
|
| 966 |
-
{
|
| 967 |
-
"x1": 478.5,
|
| 968 |
-
"y1": 941.3999633789062,
|
| 969 |
-
"x2": 485.7166748046875,
|
| 970 |
-
"y2": 953.3999633789062,
|
| 971 |
-
"width": 816,
|
| 972 |
-
"height": 1056,
|
| 973 |
-
"pageNumber": 4
|
| 974 |
-
},
|
| 975 |
-
{
|
| 976 |
-
"x1": 491.4666748046875,
|
| 977 |
-
"y1": 947.0499877929688,
|
| 978 |
-
"x2": 499.54998779296875,
|
| 979 |
-
"y2": 957.0499877929688,
|
| 980 |
-
"width": 816,
|
| 981 |
-
"height": 1056,
|
| 982 |
-
"pageNumber": 4
|
| 983 |
-
},
|
| 984 |
-
{
|
| 985 |
-
"x1": 143.68333435058594,
|
| 986 |
-
"y1": 949.5833129882812,
|
| 987 |
-
"x2": 477.76666259765625,
|
| 988 |
-
"y2": 963.5833129882812,
|
| 989 |
-
"width": 816,
|
| 990 |
-
"height": 1056,
|
| 991 |
-
"pageNumber": 4
|
| 992 |
-
},
|
| 993 |
-
{
|
| 994 |
-
"x1": 510,
|
| 995 |
-
"y1": 949.5833129882812,
|
| 996 |
-
"x2": 671.88330078125,
|
| 997 |
-
"y2": 963.5833129882812,
|
| 998 |
-
"width": 816,
|
| 999 |
-
"height": 1056,
|
| 1000 |
-
"pageNumber": 4
|
| 1001 |
-
},
|
| 1002 |
-
{
|
| 1003 |
-
"x1": 662.75,
|
| 1004 |
-
"y1": 955.1499633789062,
|
| 1005 |
-
"x2": 669.8499908447266,
|
| 1006 |
-
"y2": 963.1166229248047,
|
| 1007 |
-
"width": 816,
|
| 1008 |
-
"height": 1056,
|
| 1009 |
-
"pageNumber": 4
|
| 1010 |
-
},
|
| 1011 |
-
{
|
| 1012 |
-
"x1": 491.4666748046875,
|
| 1013 |
-
"y1": 956.3499755859375,
|
| 1014 |
-
"x2": 527.9833526611328,
|
| 1015 |
-
"y2": 966.3499755859375,
|
| 1016 |
-
"width": 816,
|
| 1017 |
-
"height": 1056,
|
| 1018 |
-
"pageNumber": 4
|
| 1019 |
-
}
|
| 1020 |
-
]
|
| 1021 |
-
},
|
| 1022 |
-
"content": {
|
| 1023 |
-
"text": "4To illustrate why the dot products get large, assume that the components of q and k are independent random\r variables with mean 0 and variance 1. Then their dot product, q · k = Pdk\r i=1 qiki, has mean 0 and variance dk ."
|
| 1024 |
-
}
|
| 1025 |
-
}],
|
| 1026 |
-
3: [{
|
| 1027 |
-
"id": "highlight_1755776743896",
|
| 1028 |
-
"position": {
|
| 1029 |
-
"boundingRect": {
|
| 1030 |
-
"x1": 143.53334045410156,
|
| 1031 |
-
"y1": 863.8500061035156,
|
| 1032 |
-
"x2": 674.2500152587891,
|
| 1033 |
-
"y2": 924.5333404541016,
|
| 1034 |
-
"width": 816,
|
| 1035 |
-
"height": 1056,
|
| 1036 |
-
"pageNumber": 4
|
| 1037 |
-
},
|
| 1038 |
-
"rects": [
|
| 1039 |
-
{
|
| 1040 |
-
"x1": 144.0166778564453,
|
| 1041 |
-
"y1": 863.8500061035156,
|
| 1042 |
-
"x2": 442.54998779296875,
|
| 1043 |
-
"y2": 880.8500061035156,
|
| 1044 |
-
"width": 816,
|
| 1045 |
-
"height": 1056,
|
| 1046 |
-
"pageNumber": 4
|
| 1047 |
-
},
|
| 1048 |
-
{
|
| 1049 |
-
"x1": 466.41668701171875,
|
| 1050 |
-
"y1": 863.8500061035156,
|
| 1051 |
-
"x2": 674.2500152587891,
|
| 1052 |
-
"y2": 880.8500061035156,
|
| 1053 |
-
"width": 816,
|
| 1054 |
-
"height": 1056,
|
| 1055 |
-
"pageNumber": 4
|
| 1056 |
-
},
|
| 1057 |
-
{
|
| 1058 |
-
"x1": 442.433349609375,
|
| 1059 |
-
"y1": 869.9166717529297,
|
| 1060 |
-
"x2": 465.8166809082031,
|
| 1061 |
-
"y2": 880.9166717529297,
|
| 1062 |
-
"width": 816,
|
| 1063 |
-
"height": 1056,
|
| 1064 |
-
"pageNumber": 4
|
| 1065 |
-
},
|
| 1066 |
-
{
|
| 1067 |
-
"x1": 143.53334045410156,
|
| 1068 |
-
"y1": 878.3166809082031,
|
| 1069 |
-
"x2": 672.4166717529297,
|
| 1070 |
-
"y2": 895.3166809082031,
|
| 1071 |
-
"width": 816,
|
| 1072 |
-
"height": 1056,
|
| 1073 |
-
"pageNumber": 4
|
| 1074 |
-
},
|
| 1075 |
-
{
|
| 1076 |
-
"x1": 144.0166778564453,
|
| 1077 |
-
"y1": 892.8833465576172,
|
| 1078 |
-
"x2": 672.8666687011719,
|
| 1079 |
-
"y2": 909.8833465576172,
|
| 1080 |
-
"width": 816,
|
| 1081 |
-
"height": 1056,
|
| 1082 |
-
"pageNumber": 4
|
| 1083 |
-
},
|
| 1084 |
-
{
|
| 1085 |
-
"x1": 260.8666687011719,
|
| 1086 |
-
"y1": 899.0666809082031,
|
| 1087 |
-
"x2": 265.1000061035156,
|
| 1088 |
-
"y2": 910.0666809082031,
|
| 1089 |
-
"width": 816,
|
| 1090 |
-
"height": 1056,
|
| 1091 |
-
"pageNumber": 4
|
| 1092 |
-
},
|
| 1093 |
-
{
|
| 1094 |
-
"x1": 280.933349609375,
|
| 1095 |
-
"y1": 899.0666809082031,
|
| 1096 |
-
"x2": 288.75001525878906,
|
| 1097 |
-
"y2": 910.0666809082031,
|
| 1098 |
-
"width": 816,
|
| 1099 |
-
"height": 1056,
|
| 1100 |
-
"pageNumber": 4
|
| 1101 |
-
},
|
| 1102 |
-
{
|
| 1103 |
-
"x1": 320.3500061035156,
|
| 1104 |
-
"y1": 899.0666809082031,
|
| 1105 |
-
"x2": 327.75001525878906,
|
| 1106 |
-
"y2": 910.0666809082031,
|
| 1107 |
-
"width": 816,
|
| 1108 |
-
"height": 1056,
|
| 1109 |
-
"pageNumber": 4
|
| 1110 |
-
},
|
| 1111 |
-
{
|
| 1112 |
-
"x1": 144.0166778564453,
|
| 1113 |
-
"y1": 907.4666748046875,
|
| 1114 |
-
"x2": 672.2167053222656,
|
| 1115 |
-
"y2": 924.4666748046875,
|
| 1116 |
-
"width": 816,
|
| 1117 |
-
"height": 1056,
|
| 1118 |
-
"pageNumber": 4
|
| 1119 |
-
},
|
| 1120 |
-
{
|
| 1121 |
-
"x1": 596.4833374023438,
|
| 1122 |
-
"y1": 913.5333404541016,
|
| 1123 |
-
"x2": 603.9666442871094,
|
| 1124 |
-
"y2": 924.5333404541016,
|
| 1125 |
-
"width": 816,
|
| 1126 |
-
"height": 1056,
|
| 1127 |
-
"pageNumber": 4
|
| 1128 |
-
}
|
| 1129 |
-
]
|
| 1130 |
-
},
|
| 1131 |
-
"content": {
|
| 1132 |
-
"text": "Instead of performing a single attention function with dmodel-dimensional keys, values and queries,\r we found it beneficial to linearly project the queries, keys and values h times with different, learned\r linear projections to dk, dk and dv dimensions, respectively. On each of these projected versions of\r queries, keys and values we then perform the attention function in parallel, yielding dv -dimensional"
|
| 1133 |
-
}
|
| 1134 |
-
}, {
|
| 1135 |
-
"id": "highlight_1755776791875",
|
| 1136 |
-
"position": {
|
| 1137 |
-
"boundingRect": {
|
| 1138 |
-
"x1": 143.36666870117188,
|
| 1139 |
-
"y1": 96.98333740234375,
|
| 1140 |
-
"x2": 673.2666778564453,
|
| 1141 |
-
"y2": 356.43333435058594,
|
| 1142 |
-
"width": 816,
|
| 1143 |
-
"height": 1056,
|
| 1144 |
-
"pageNumber": 5
|
| 1145 |
-
},
|
| 1146 |
-
"rects": [
|
| 1147 |
-
{
|
| 1148 |
-
"x1": 144.0166778564453,
|
| 1149 |
-
"y1": 96.98333740234375,
|
| 1150 |
-
"x2": 673.2666778564453,
|
| 1151 |
-
"y2": 113.98333740234375,
|
| 1152 |
-
"width": 816,
|
| 1153 |
-
"height": 1056,
|
| 1154 |
-
"pageNumber": 5
|
| 1155 |
-
},
|
| 1156 |
-
{
|
| 1157 |
-
"x1": 144.0166778564453,
|
| 1158 |
-
"y1": 111.55000305175781,
|
-            "x2": 254.2166748046875,
-            "y2": 128.5500030517578,
-            "width": 816,
-            "height": 1056,
-            "pageNumber": 5
-          },
-          [… further per-line highlight rects for page 5 omitted …]
-        ]
-      },
-      "content": {
-        "text": "output values. These are concatenated and once again projected, resulting in the final values, as\r depicted in Figure 2.\r Multi-head attention allows the model to jointly attend to information from different representation\r subspaces at different positions. With a single attention head, averaging inhibits this.\r MultiHead(Q, K, V ) = Concat(head1, ..., headh)W O\r where headi = Attention(QW Q\r i , KW K\r i , V W V\r i )\r Where the projections are parameter matrices W Q\r i ∈ Rdmodel×dk , W K\r i ∈ Rdmodel×dk , W V\r i ∈ Rdmodel×dv\r and W O ∈ Rhdv ×dmodel .\r In this work we employ h = 8 parallel attention layers, or heads. For each of these we use\r dk = dv = dmodel/h = 64. Due to the reduced dimension of each head, the total computational cost\r is similar to that of single-head attention with full dimensionality."
-      }
-    }],
-    4: [{
-      "id": "highlight_1755776822210",
-      "position": {
-        "boundingRect": {
-          "x1": 143.60000610351562,
-          "y1": 397,
-          "x2": 674.7833557128906,
-          "y2": 644.1000213623047,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 5
-        },
-        "rects": [
-          [… per-line highlight rects for page 5 omitted …]
-        ]
-      },
-      "content": {
-        "text": "The Transformer uses multi-head attention in three different ways:\r • In \"encoder-decoder attention\" layers, the queries come from the previous decoder layer,\r and the memory keys and values come from the output of the encoder. This allows every\r position in the decoder to attend over all positions in the input sequence. This mimics the\r typical encoder-decoder attention mechanisms in sequence-to-sequence models such as\r [38, 2, 9].\r • The encoder contains self-attention layers. In a self-attention layer all of the keys, values\r and queries come from the same place, in this case, the output of the previous layer in the\r encoder. Each position in the encoder can attend to all positions in the previous layer of the\r encoder.\r • Similarly, self-attention layers in the decoder allow each position in the decoder to attend to\r all positions in the decoder up to and including that position. We need to prevent leftward\r information flow in the decoder to preserve the auto-regressive property. We implement this\r inside of scaled dot-product attention by masking out (setting to −∞) all values in the input\r of the softmax which correspond to illegal connections"
-      }
-    }],
-    5: [{
-      "id": "highlight_1755776852579",
-      "position": {
-        "boundingRect": {
-          "x1": 143.36666870117188,
-          "y1": 688.2333374023438,
-          "x2": 675.36669921875,
-          "y2": 845.4333190917969,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 5
-        },
-        "rects": [
-          [… per-line highlight rects for page 5 omitted …]
-        ]
-      },
-      "content": {
-        "text": "In addition to attention sub-layers, each of the layers in our encoder and decoder contains a fully\r connected feed-forward network, which is applied to each position separately and identically. This\r consists of two linear transformations with a ReLU activation in between.\r FFN(x) = max(0, xW1 + b1)W2 + b2 (2)\r While the linear transformations are the same across different positions, they use different parameters\r from layer to layer. Another way of describing this is as two convolutions with kernel size 1.\r The dimensionality of input and output is dmodel = 512, and the inner-layer has dimensionality\r df f = 2048."
-      }
-    }],
-    6: [{
-      "id": "highlight_1755776884173",
-      "position": {
-        "boundingRect": {
-          "x1": 144.0166778564453,
-          "y1": 889.5166473388672,
-          "x2": 674.816650390625,
-          "y2": 964.8666381835938,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 5
-        },
-        "rects": [
-          [… per-line highlight rects for page 5 omitted …]
-        ]
-      },
-      "content": {
-        "text": "Similarly to other sequence transduction models, we use learned embeddings to convert the input\r tokens and output tokens to vectors of dimension dmodel. We also use the usual learned linear transfor-\r mation and softmax function to convert the decoder output to predicted next-token probabilities. In\r our model, we share the same weight matrix between the two embedding layers and the pre-softmax\r linear transformation, similar to [ 30 ]. In the embedding layers, we multiply those weights by √dmodel."
-      }
-    }, {
-      "id": "highlight_1755776906056",
-      "position": {
-        "boundingRect": {
-          "x1": 143.36666870117188,
-          "y1": 309.6666717529297,
-          "x2": 674.9500274658203,
-          "y2": 641.8833312988281,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 6
-        },
-        "rects": [
-          [… per-line highlight rects for page 6 omitted …]
-        ]
-      },
-      "content": {
-        "text": "Since our model contains no recurrence and no convolution, in order for the model to make use of the\r order of the sequence, we must inject some information about the relative or absolute position of the\r tokens in the sequence. To this end, we add \"positional encodings\" to the input embeddings at the\r bottoms of the encoder and decoder stacks. The positional encodings have the same dimension dmodel\r as the embeddings, so that the two can be summed. There are many choices of positional encodings,\r learned and fixed [9].\r In this work, we use sine and cosine functions of different frequencies:\r P E(pos,2i) = sin(pos/100002i/dmodel )\r P E(pos,2i+1) = cos(pos/100002i/dmodel )\r where pos is the position and i is the dimension. That is, each dimension of the positional encoding\r corresponds to a sinusoid. The wavelengths form a geometric progression from 2π to 10000 · 2π. We\r chose this function because we hypothesized it would allow the model to easily learn to attend by\r relative positions, since for any fixed offset k, P Epos+k can be represented as a linear function of\r P Epos.\r We also experimented with using learned positional embeddings [9] instead, and found that the two\r versions produced nearly identical results (see Table 3 row (E)). We chose the sinusoidal version\r because it may allow the model to extrapolate to sequence lengths longer than the ones encountered\r during training."
-      }
-    }],
-    7: [{
-      "id": "highlight_1755776991536",
-      "position": {
-        "boundingRect": {
-          "x1": 143.53334045410156,
-          "y1": 736.7000274658203,
-          "x2": 673.7833404541016,
-          "y2": 964.7000122070312,
-          "width": 816,
-          "height": 1056,
-          "pageNumber": 6
-        },
-        "rects": [
-          [… per-line highlight rects for page 6 omitted …]
| 2238 |
-
"pageNumber": 6
|
| 2239 |
-
},
|
| 2240 |
-
{
|
| 2241 |
-
"x1": 144.0166778564453,
|
| 2242 |
-
"y1": 838.61669921875,
|
| 2243 |
-
"x2": 673.3499908447266,
|
| 2244 |
-
"y2": 855.61669921875,
|
| 2245 |
-
"width": 816,
|
| 2246 |
-
"height": 1056,
|
| 2247 |
-
"pageNumber": 6
|
| 2248 |
-
},
|
| 2249 |
-
{
|
| 2250 |
-
"x1": 144.0166778564453,
|
| 2251 |
-
"y1": 853.183349609375,
|
| 2252 |
-
"x2": 673.3000030517578,
|
| 2253 |
-
"y2": 870.183349609375,
|
| 2254 |
-
"width": 816,
|
| 2255 |
-
"height": 1056,
|
| 2256 |
-
"pageNumber": 6
|
| 2257 |
-
},
|
| 2258 |
-
{
|
| 2259 |
-
"x1": 144.0166778564453,
|
| 2260 |
-
"y1": 867.6500244140625,
|
| 2261 |
-
"x2": 672.3333282470703,
|
| 2262 |
-
"y2": 884.6500244140625,
|
| 2263 |
-
"width": 816,
|
| 2264 |
-
"height": 1056,
|
| 2265 |
-
"pageNumber": 6
|
| 2266 |
-
},
|
| 2267 |
-
{
|
| 2268 |
-
"x1": 144.0166778564453,
|
| 2269 |
-
"y1": 882.2166748046875,
|
| 2270 |
-
"x2": 673.3333282470703,
|
| 2271 |
-
"y2": 899.2166748046875,
|
| 2272 |
-
"width": 816,
|
| 2273 |
-
"height": 1056,
|
| 2274 |
-
"pageNumber": 6
|
| 2275 |
-
},
|
| 2276 |
-
{
|
| 2277 |
-
"x1": 144.0166778564453,
|
| 2278 |
-
"y1": 896.8000183105469,
|
| 2279 |
-
"x2": 254.25,
|
| 2280 |
-
"y2": 913.8000183105469,
|
| 2281 |
-
"width": 816,
|
| 2282 |
-
"height": 1056,
|
| 2283 |
-
"pageNumber": 6
|
| 2284 |
-
},
|
| 2285 |
-
{
|
| 2286 |
-
"x1": 143.53334045410156,
|
| 2287 |
-
"y1": 918.6500244140625,
|
| 2288 |
-
"x2": 673.7833404541016,
|
| 2289 |
-
"y2": 935.6500244140625,
|
| 2290 |
-
"width": 816,
|
| 2291 |
-
"height": 1056,
|
| 2292 |
-
"pageNumber": 6
|
| 2293 |
-
},
|
| 2294 |
-
{
|
| 2295 |
-
"x1": 144.0166778564453,
|
| 2296 |
-
"y1": 933.2333374023438,
|
| 2297 |
-
"x2": 672.4333190917969,
|
| 2298 |
-
"y2": 950.2333374023438,
|
| 2299 |
-
"width": 816,
|
| 2300 |
-
"height": 1056,
|
| 2301 |
-
"pageNumber": 6
|
| 2302 |
-
},
|
| 2303 |
-
{
|
| 2304 |
-
"x1": 144.0166778564453,
|
| 2305 |
-
"y1": 947.7000122070312,
|
| 2306 |
-
"x2": 673.3000030517578,
|
| 2307 |
-
"y2": 964.7000122070312,
|
| 2308 |
-
"width": 816,
|
| 2309 |
-
"height": 1056,
|
| 2310 |
-
"pageNumber": 6
|
| 2311 |
-
}
|
| 2312 |
-
]
|
| 2313 |
-
},
|
| 2314 |
-
"content": {
|
| 2315 |
-
"text": "Motivating our use of self-attention we\r consider three desiderata.\r One is the total computational complexity per layer. Another is the amount of computation that can\r be parallelized, as measured by the minimum number of sequential operations required.\r The third is the path length between long-range dependencies in the network. Learning long-range\r dependencies is a key challenge in many sequence transduction tasks. One key factor affecting the\r ability to learn such dependencies is the length of the paths forward and backward signals have to\r traverse in the network. The shorter these paths between any combination of positions in the input\r and output sequences, the easier it is to learn long-range dependencies [12]. Hence we also compare\r the maximum path length between any two input and output positions in networks composed of the\r different layer types.\r As noted in Table 1, a self-attention layer connects all positions with a constant number of sequentially\r executed operations, whereas a recurrent layer requires O(n) sequential operations. In terms of\r computational complexity, self-attention layers are faster than recurrent layers when the sequence"
|
| 2316 |
-
}
|
| 2317 |
-
}, {
|
| 2318 |
-
"id": "highlight_1755777048311",
|
| 2319 |
-
"position": {
|
| 2320 |
-
"boundingRect": {
|
| 2321 |
-
"x1": 144.0166778564453,
|
| 2322 |
-
"y1": 96.98333740234375,
|
| 2323 |
-
"x2": 672.5,
|
| 2324 |
-
"y2": 128.55001831054688,
|
| 2325 |
-
"width": 816,
|
| 2326 |
-
"height": 1056,
|
| 2327 |
-
"pageNumber": 7
|
| 2328 |
-
},
|
| 2329 |
-
"rects": [
|
| 2330 |
-
{
|
| 2331 |
-
"x1": 144.0166778564453,
|
| 2332 |
-
"y1": 96.98333740234375,
|
| 2333 |
-
"x2": 672.5,
|
| 2334 |
-
"y2": 113.98333740234375,
|
| 2335 |
-
"width": 816,
|
| 2336 |
-
"height": 1056,
|
| 2337 |
-
"pageNumber": 7
|
| 2338 |
-
},
|
| 2339 |
-
{
|
| 2340 |
-
"x1": 144.0166778564453,
|
| 2341 |
-
"y1": 111.55001831054688,
|
| 2342 |
-
"x2": 271.1000061035156,
|
| 2343 |
-
"y2": 128.55001831054688,
|
| 2344 |
-
"width": 816,
|
| 2345 |
-
"height": 1056,
|
| 2346 |
-
"pageNumber": 7
|
| 2347 |
-
}
|
| 2348 |
-
]
|
| 2349 |
-
},
|
| 2350 |
-
"content": {
|
| 2351 |
-
"text": "length n is smaller than the representation dimensionality d, which is most often the case with\r sentence representation"
|
| 2352 |
-
}
|
| 2353 |
-
}]
|
| 2354 |
-
};
|
| 2355 |
|
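The inline test data deleted above was moved to frontend/src/highlights.json: per chunk index, each highlight carries an `id`, a `position` with a `boundingRect` plus one rect per rendered text line, and the highlighted `content.text`. A minimal sketch of reading that shape; `rectsOnPage` is a hypothetical helper for illustration, not part of this commit.

```javascript
// Collect the per-line rects of a highlight that fall on a given page.
// Assumes the highlight shape shown in the deleted inline data above.
function rectsOnPage(highlight, pageNumber) {
  return highlight.position.rects.filter((r) => r.pageNumber === pageNumber);
}

// Reduced sample mirroring the structure of the data moved to highlights.json.
const sample = {
  id: "highlight_example",
  position: {
    boundingRect: { x1: 143.5, y1: 736.7, x2: 673.8, y2: 964.7, width: 816, height: 1056, pageNumber: 6 },
    rects: [
      { x1: 466.5, y1: 736.7, x2: 673.3, y2: 753.7, width: 816, height: 1056, pageNumber: 6 },
      { x1: 144.0, y1: 96.9, x2: 672.5, y2: 113.9, width: 816, height: 1056, pageNumber: 7 },
    ],
  },
  content: { text: "..." },
};

console.log(rectsOnPage(sample, 6).length);
```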
   // Temporarily inject test highlights into documentData for testing
   const documentDataWithHighlights = documentData ? {

@@ -2374,47 +154,6 @@ function DocumentProcessor() {
     startInteractiveLesson();
   };

-  // Early returns for different states
-  if (!selectedFile) {
-    return (
-      <div className="h-screen bg-gray-50 flex items-center justify-center">
-        <input
-          ref={fileInputRef}
-          type="file"
-          accept=".pdf"
-          className="hidden"
-          onChange={handleFileChange}
-        />
-        <button
-          onClick={() => fileInputRef.current.click()}
-          className="px-6 py-3 bg-white shadow-md hover:shadow-lg text-gray-700 font-medium rounded-lg transition-all"
-        >
-          Select PDF
-        </button>
-      </div>
-    );
-  }
-
-  if (!documentData) {
-    return (
-      <div className="h-screen bg-gray-50 flex items-center justify-center">
-        <div className="flex gap-4">
-          <button
-            onClick={processDocument}
-            className="px-6 py-3 bg-white shadow-md hover:shadow-lg text-gray-700 font-medium rounded-lg transition-all"
-          >
-            Process
-          </button>
-          <button
-            onClick={() => setSelectedFile(null)}
-            className="px-6 py-3 bg-white shadow-md hover:shadow-lg text-gray-700 font-medium rounded-lg transition-all"
-          >
-            ← Back
-          </button>
-        </div>
-      </div>
-    );
-  }

   // Main render
   return (
@@ -2468,9 +207,15 @@ function DocumentProcessor() {
       />
     </div>

-    {/* Right Panel Content - Welcome Screen or Chunk Panel */}
     <div className="flex-1 flex flex-col min-h-0 bg-white rounded-lg shadow-sm">
-      {
         <WelcomeScreen onGetStarted={handleGetStarted} />
       ) : (
         <ChunkPanel
   import ChunkPanel from './ChunkPanel';
   import ProgressBar from './ProgressBar';
   import WelcomeScreen from './WelcomeScreen';
+  import OnboardingWizard from './OnboardingWizard';

+  import Highlights from '../highlights.json';
+
+  function DocumentProcessor({ initialFile, fileName, academicBackground }) {
+    const onboardingData = {
+      hasFile: !!initialFile,
+      fileName,
+      academicBackground
+    };
+
     // State for PDF navigation
     const [pdfNavigation, setPdfNavigation] = useState(null);
     // State for first LLM response loading
     const [waitingForFirstResponse, setWaitingForFirstResponse] = useState(false);
+    // State for welcome screen visibility - if coming from onboarding, skip welcome
+    const [showWelcomeScreen, setShowWelcomeScreen] = useState(!onboardingData?.hasFile);
     // State for document controls (like scrollToPage)
     const [documentControls, setDocumentControls] = useState(null);
+    // State for onboarding wizard
+    const [showOnboarding, setShowOnboarding] = useState(!!onboardingData?.hasFile);
+    const [onboardingCompleted, setOnboardingCompleted] = useState(false);
+
+    // Custom hooks
+    const {
+      fileInputRef,
+      selectedFile,
+      processing,
+      uploadProgress,
+      documentData,
+      handleFileChange,
+      processDocument,
+      setSelectedFile
+    } = useDocumentProcessor();
+
+    // Auto-load document when coming from onboarding
+    useEffect(() => {
+      if (onboardingData?.hasFile && initialFile && !selectedFile && !documentData) {
+        // Use the actual uploaded file
+        setSelectedFile(initialFile);
+        setTimeout(() => {
+          processDocument();
+        }, 500);
+      }
+    }, [onboardingData, selectedFile, documentData, setSelectedFile, processDocument, initialFile]);

     // Function to get the page number of the first chunk
     const getFirstChunkPage = () => {
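The new state hooks decide which right panel appears on mount: a file handed over from onboarding opens the wizard, otherwise the welcome screen shows. That decision can be summarized as a pure function; this is a sketch of the logic, not code from the commit.

```javascript
// Sketch of the initial right-panel decision made by the showOnboarding /
// showWelcomeScreen state above. Hypothetical helper, not part of this commit.
function initialPanelState(onboardingData) {
  const hasFile = Boolean(onboardingData && onboardingData.hasFile);
  return {
    showOnboarding: hasFile,      // wizard first when a file arrived via onboarding
    showWelcomeScreen: !hasFile,  // otherwise fall back to the welcome screen
  };
}

console.log(initialPanelState({ hasFile: true }));
console.log(initialPanelState(null));
```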
       }, 2000);
     }
   };
+
+  // Function to handle onboarding completion
+  const handleOnboardingComplete = (onboardingPreferences) => {
+    console.log('Onboarding preferences:', onboardingPreferences);
+    setShowOnboarding(false);
+    setOnboardingCompleted(true);
+    // TODO: Store onboarding preferences for use in tutoring
+
+    // Scroll to the first chunk after a short delay
+    if (documentControls && documentControls.scrollToFirstChunk) {
+      setTimeout(() => {
+        documentControls.scrollToFirstChunk();
+      }, 500);
+    }
+  };

   const {
     chunkStates,

   // Add test preloaded highlights data - keyed by chunk index
   // Lennart version
+  const testPreloadedHighlights = Highlights;
   // Temporarily inject test highlights into documentData for testing
   const documentDataWithHighlights = documentData ? {

     startInteractiveLesson();
   };

   // Main render
   return (

       />
     </div>

+    {/* Right Panel Content - Welcome Screen, Onboarding, or Chunk Panel */}
     <div className="flex-1 flex flex-col min-h-0 bg-white rounded-lg shadow-sm">
+      {showOnboarding ? (
+        <OnboardingWizard
+          fileName={onboardingData?.fileName || 'Unknown Paper'}
+          academicBackground={onboardingData?.academicBackground || ''}
+          onComplete={handleOnboardingComplete}
+        />
+      ) : showWelcomeScreen ? (
         <WelcomeScreen onGetStarted={handleGetStarted} />
       ) : (
         <ChunkPanel
@@ -123,12 +123,12 @@ const DocumentViewer = ({ selectedFile, documentData, onPageChange, preloadedHig
   };

   // Call onDocumentReady only once when utils become available
-
   if (onDocumentReady && !documentReadyCalledRef.current && highlighterUtilsRef.current) {
     documentReadyCalledRef.current = true;
     onDocumentReady({ scrollToFirstChunk });
   }
-  };

   // Utility function to normalize highlight data
   const normalizeHighlight = (highlightData) => {

@@ -280,7 +280,6 @@ const DocumentViewer = ({ selectedFile, documentData, onPageChange, preloadedHig
     pdfScaleValue={zoom}
     utilsRef={(_pdfHighlighterUtils) => {
       highlighterUtilsRef.current = _pdfHighlighterUtils;
-      callOnDocumentReady();
     }}
     highlights={highlights}
     onSelection={handleSelection}
   };

   // Call onDocumentReady only once when utils become available
+  useEffect(() => {
     if (onDocumentReady && !documentReadyCalledRef.current && highlighterUtilsRef.current) {
       documentReadyCalledRef.current = true;
       onDocumentReady({ scrollToFirstChunk });
     }
+  }, [onDocumentReady, scrollToFirstChunk, highlighterUtilsRef.current]);

   // Utility function to normalize highlight data
   const normalizeHighlight = (highlightData) => {

     pdfScaleValue={zoom}
     utilsRef={(_pdfHighlighterUtils) => {
       highlighterUtilsRef.current = _pdfHighlighterUtils;
     }}
     highlights={highlights}
     onSelection={handleSelection}
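The ref-guarded effect above enforces a call-at-most-once contract: `onDocumentReady` fires only the first time the highlighter utils are available. The same contract can be sketched as a plain closure, independent of React; `makeReadyNotifier` is a hypothetical helper for illustration, not part of this commit.

```javascript
// Plain-JS sketch of the "notify once when utils are ready" contract that
// documentReadyCalledRef enforces in the effect above.
// Hypothetical helper, not part of this commit.
function makeReadyNotifier(onDocumentReady) {
  let called = false;
  return function notify(utils) {
    // Skip when there is no callback, it already ran, or utils are not ready.
    if (!onDocumentReady || called || !utils) return false;
    called = true;
    onDocumentReady(utils);
    return true;
  };
}

let calls = 0;
const notify = makeReadyNotifier(() => { calls += 1; });
notify(null);                              // not ready yet: ignored
notify({ scrollToFirstChunk: () => {} });  // fires the callback once
notify({ scrollToFirstChunk: () => {} });  // already called: ignored
```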
@@ -1,32 +1,190 @@
-  import {

   function Homepage() {
     return (
-      <div className="min-h-screen bg-gray-50 flex items-center justify-center">
-        <div className="
-          <
-
-
-
-
-
-
-
-
-
-
-          </
-          <Link
-            to="/upload"
-            className="bg-gray-500 hover:bg-gray-600 text-white font-bold py-3 px-6 rounded-lg transition-colors duration-200 inline-block"
-          >
-            Legacy Upload
-          </Link>
         </div>
       </div>
     </div>
     );
   }
| 32 |
export default Homepage;
|
|
|
|
import { useState, useRef } from 'react';
import DocumentProcessor from './DocumentProcessor';

function Homepage() {
  const [hasFile, setHasFile] = useState(false);
  const [fileName, setFileName] = useState('');
  const [selectedFile, setSelectedFile] = useState(null);
  const [academicBackground, setAcademicBackground] = useState('');
  const [showAbout, setShowAbout] = useState(false);
  const [showDocumentProcessor, setShowDocumentProcessor] = useState(false);
  const fileRef = useRef(null);

  function openFilePicker() {
    if (fileRef.current) fileRef.current.click();
  }

  function handleFileChange(event) {
    const file = event.target.files?.[0];
    if (file) {
      setSelectedFile(file);
      setHasFile(true);
      setFileName(file.name);
      setTimeout(() => setShowAbout(true), 250);
    }
  }

  function handleContinue() {
    setShowDocumentProcessor(true);
  }

  // If we're showing the document processor, render it instead of the onboarding
  if (showDocumentProcessor) {
    return (
      <DocumentProcessor
        initialFile={selectedFile}
        fileName={fileName}
        academicBackground={academicBackground}
      />
    );
  }

  return (
    <div className="min-h-screen w-full bg-gray-50 text-gray-900 flex items-center justify-center p-6">
      <div className="max-w-4xl w-full">
        <div className="mb-6 flex items-center justify-between">
          <div className="flex items-center gap-3">
            <div className="h-10 w-10 rounded-2xl bg-indigo-500 text-white flex items-center justify-center">
              <svg viewBox="0 0 24 24" className="h-6 w-6">
                <path d="M12 3l7.5 4.5v9L12 21 4.5 16.5v-9L12 3z" fill="currentColor"/>
              </svg>
            </div>
            <div>
              <h1 className="text-2xl font-semibold tracking-tight">SocraticAI</h1>
              <p className="text-sm text-gray-600">Guided comprehension for complex papers</p>
            </div>
          </div>
          <div className="text-xs text-gray-500">Onboarding Flow</div>
        </div>

        {!hasFile && !showAbout && (
          <Landing
            openFilePicker={openFilePicker}
            fileRef={fileRef}
            handleFileChange={handleFileChange}
            academicBackground={academicBackground}
            setAcademicBackground={setAcademicBackground}
          />
        )}

        {hasFile && showAbout && (
          <About
            fileName={fileName}
            onContinue={handleContinue}
            openFilePicker={openFilePicker}
            academicBackground={academicBackground}
          />
        )}
      </div>
    </div>
  );
}

function Landing({ openFilePicker, fileRef, handleFileChange, academicBackground, setAcademicBackground }) {
  return (
    <div className="rounded-2xl border border-gray-200 bg-white shadow-lg p-8 md:p-12">
      <div>
        <h2 className="text-3xl md:text-4xl font-semibold leading-tight text-gray-900">Welcome to SocraticAI</h2>
        <p className="mt-3 text-gray-600">
          Your mentor-like companion for mastering research papers. Upload a paper and we'll turn it into a guided
          learning path — with questions, feedback, and visible progress.
        </p>

        <div className="mt-6">
          <label htmlFor="academic-background" className="block text-sm font-medium text-gray-700 mb-2">
            Describe your academic background
          </label>
          <p className="text-xs text-gray-500 mb-3">
            This helps our tutoring AI adapt to what you already know.
          </p>
          <textarea
            id="academic-background"
            value={academicBackground}
            onChange={(e) => setAcademicBackground(e.target.value)}
            placeholder="e.g., PhD in Computer Science, undergraduate in Physics, working professional in data science..."
            className="w-full px-4 py-3 rounded-lg border border-gray-300 text-gray-900 placeholder-gray-400 focus:outline-none focus:ring-2 focus:ring-indigo-500 focus:border-indigo-500 resize-none"
            rows="3"
          />
        </div>

        <div className="mt-8">
          <button
            onClick={openFilePicker}
            className="inline-flex items-center gap-2 rounded-lg px-5 py-3 bg-indigo-500 hover:bg-indigo-600 transition text-white font-medium shadow-lg"
          >
            <UploadIcon /> Upload a paper (PDF)
          </button>
          <input
            ref={fileRef}
            type="file"
            accept="application/pdf"
            className="hidden"
            onChange={handleFileChange}
          />
        </div>
        <p className="mt-3 text-xs text-gray-500">Your PDF will be processed locally and securely.</p>
      </div>
    </div>
  );
}

function About({ fileName, onContinue, openFilePicker, academicBackground }) {
  return (
    <div className="rounded-2xl border border-gray-200 bg-white shadow-lg p-8 md:p-12">
      <div className="flex items-start gap-4">
        <div className="h-10 w-10 rounded-xl bg-emerald-100 border border-emerald-200 flex items-center justify-center">
          <CheckIcon />
        </div>
        <div className="flex-1">
          <h3 className="text-2xl font-semibold text-gray-900">Paper added</h3>
          <p className="text-gray-600 mt-1">{fileName}</p>
          {academicBackground && (
            <p className="text-gray-500 mt-2 text-sm">
              Background: {academicBackground}
            </p>
          )}
          <p className="mt-4 text-gray-800">Here's how SocraticAI helps you learn deeply:</p>
          <ul className="mt-3 text-gray-600 space-y-2 list-disc list-inside">
            <li><span className="font-medium text-gray-800">Automatic inflection points.</span> We flag hypotheses, assumptions, method shifts, and key claims.</li>
            <li><span className="font-medium text-gray-800">Guided micro-conversations.</span> Short question-and-answer loops validate and extend your understanding.</li>
            <li><span className="font-medium text-gray-800">Visible progress.</span> Confidence builds as you complete focused checkpoints.</li>
          </ul>
          <div className="mt-6 flex flex-wrap gap-3">
            <button
              onClick={onContinue}
              className="rounded-lg px-5 py-3 bg-indigo-500 hover:bg-indigo-600 transition text-white font-medium shadow-lg"
            >
              Continue
            </button>
            <button
              onClick={openFilePicker}
              className="rounded-lg px-5 py-3 bg-gray-100 hover:bg-gray-200 transition text-gray-700 font-medium border border-gray-300"
            >
              Choose another paper
            </button>
          </div>
        </div>
      </div>
    </div>
  );
}

function UploadIcon() {
  return (
    <svg viewBox="0 0 24 24" className="h-5 w-5" fill="none" stroke="currentColor" strokeWidth="1.8">
      <path d="M12 16V4m0 0l-4 4m4-4l4 4"/>
      <path d="M20 16v2a2 2 0 0 1-2 2H6a2 2 0 0 1-2-2v-2"/>
    </svg>
  );
}

function CheckIcon() {
  return (
    <svg viewBox="0 0 24 24" className="h-5 w-5 text-emerald-600" fill="none" stroke="currentColor" strokeWidth="2">
      <path d="M20 6L9 17l-5-5" />
    </svg>
  );
}

export default Homepage;
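Homepage derives three mutually exclusive views from its flags. As a quick reference, that routing logic can be expressed as a pure function (a hypothetical helper for illustration; `currentView` is not part of this commit):

```javascript
// Mirrors the conditionals in Homepage: the processor takes priority,
// then the About card, then the Landing screen. The 'transition' case
// covers the brief window between file selection and the 250ms About reveal.
function currentView({ hasFile, showAbout, showDocumentProcessor }) {
  if (showDocumentProcessor) return 'processor';
  if (hasFile && showAbout) return 'about';
  if (!hasFile && !showAbout) return 'landing';
  return 'transition';
}
```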
@@ -0,0 +1,424 @@ frontend/src/components/OnboardingWizard.jsx
import { useState, useRef, useLayoutEffect, memo } from 'react';

// Constants live at module level so they are not re-created on every render
// and stay accessible to the memoized step components below.
const scopeOptions = [
  { key: 'entire_paper', title: 'Entire Paper', desc: 'I want to understand the whole paper' },
  { key: 'specific_section', title: 'Specific Section', desc: 'Focus on a particular section, chapter, equation, or formula' }
];

const depthOptions = [
  { key: 'gist', title: 'Get the gist', desc: 'High-level understanding and main takeaways' },
  { key: 'working_understanding', title: 'Working understanding', desc: 'Detailed comprehension I can discuss and apply' },
  { key: 'reproduce', title: 'Reproduce results', desc: 'Deep enough to implement or reproduce the work' }
];

const styleOptions = [
  { key: 'concepts', title: 'Concepts', desc: 'Focus on ideas, theories, and conceptual understanding' },
  { key: 'mathematics', title: 'Mathematics', desc: 'Emphasize equations, proofs, and mathematical reasoning' },
  { key: 'methods', title: 'Methods', desc: 'Concentrate on procedures, techniques, and implementation' },
  { key: 'figures', title: 'Figures', desc: 'Focus on charts, diagrams, graphs, and visual elements' }
];

const chunkSizeOptions = [
  { key: 'small', title: 'Small chunks', desc: 'Short, focused sections with frequent checkpoints' },
  { key: 'medium', title: 'Medium chunks', desc: 'Balanced pace and depth' },
  { key: 'large', title: 'Large chunks', desc: 'Longer sections, faster pace' }
];

const chunkingOptions = [
  { key: 'guided', title: 'Generate learning chunks for me', badge: 'AI-Powered', desc: "We'll automatically break down the paper into optimal learning sections" },
  { key: 'manual', title: "I'll highlight sections myself", desc: 'Use highlighting tools to select what you want to focus on' }
];

const familiarityLabels = ['New to it', 'Somewhat new', 'Comfortable', 'Very familiar', "I've taught this"];

const OnboardingWizard = ({ fileName, academicBackground, onComplete }) => {
  const [currentStep, setCurrentStep] = useState('scope');
  const [scope, setScope] = useState(null);
  const [scopeDetails, setScopeDetails] = useState('');
  const [depth, setDepth] = useState(null);
  const [style, setStyle] = useState(null);
  const [familiarity, setFamiliarity] = useState(2);
  const [chunkSize, setChunkSize] = useState('medium');
  const [useChunking, setUseChunking] = useState(null);
  const [familiarityText, setFamiliarityText] = useState('');
  const [isLoading, setIsLoading] = useState(false);
  const [loadingPhase, setLoadingPhase] = useState(0);

  const stepNumbers = {
    scope: 1,
    depth: 2,
    style: 3,
    chunking: 4,
    familiarity: 5
  };

  const totalSteps = 5;
  const stepNumber = stepNumbers[currentStep];
  const progressPct = (stepNumber / totalSteps) * 100;

  function onNext() {
    // The scope textarea keeps its text in local state for focus stability,
    // so sync it back to parent state before navigating away.
    if (currentStep === 'scope' && scope === 'specific_section') {
      const textarea = document.getElementById('scope-details');
      if (textarea) {
        setScopeDetails(textarea.value);
      }
    }

    if (currentStep === 'scope') setCurrentStep('depth');
    else if (currentStep === 'depth') setCurrentStep('style');
    else if (currentStep === 'style') setCurrentStep('chunking');
    else if (currentStep === 'chunking') setCurrentStep('familiarity');
  }

  function onBack() {
    if (currentStep === 'depth') setCurrentStep('scope');
    else if (currentStep === 'style') setCurrentStep('depth');
    else if (currentStep === 'chunking') setCurrentStep('style');
    else if (currentStep === 'familiarity') setCurrentStep('chunking');
  }

  function handleStart() {
    setIsLoading(true);
    setLoadingPhase(0);

    // Simulate loading phases on a fixed interval, then hand the collected
    // preferences to the parent.
    const phases = [1200, 1200, 1400];
    let i = 0;
    const timer = setInterval(() => {
      setLoadingPhase(prev => prev + 1);
      i += 1;
      if (i >= phases.length) {
        clearInterval(timer);
        setTimeout(() => {
          onComplete({
            scope,
            scopeDetails,
            depth,
            style,
            familiarity,
            chunkSize,
            useChunking,
            familiarityText,
            academicBackground
          });
        }, 1000);
      }
    }, phases[0]);
  }

  if (isLoading) {
    return <LoadingScreen fileName={fileName} phase={loadingPhase} />;
  }

  return (
    <div className="h-full flex flex-col bg-white">
      <div className="border border-gray-200 rounded-t-lg overflow-hidden flex-1 flex flex-col">
        <div className="h-2 bg-gray-200">
          <div className="h-2 bg-indigo-500 transition-all duration-500" style={{ width: progressPct + '%' }} />
        </div>

        <div className="p-6 flex-1 flex flex-col">
          <div className="flex items-center justify-between mb-6">
            <div className="text-sm text-gray-500">Step {stepNumber} of {totalSteps}</div>
            <div className="text-sm text-gray-500">Setup</div>
          </div>

          <div className="flex-1">
            {currentStep === 'scope' && <StepScope scope={scope} setScope={setScope} initialScopeDetails={scopeDetails} onScopeDetailsChange={setScopeDetails} />}
            {currentStep === 'depth' && <StepDepth depth={depth} setDepth={setDepth} />}
            {currentStep === 'style' && <StepStyle style={style} setStyle={setStyle} />}
            {currentStep === 'chunking' && <StepChunking useChunking={useChunking} setUseChunking={setUseChunking} chunkSize={chunkSize} setChunkSize={setChunkSize} />}
            {currentStep === 'familiarity' && (
              <StepFamiliarity
                familiarity={familiarity}
                setFamiliarity={setFamiliarity}
                familiarityText={familiarityText}
                setFamiliarityText={setFamiliarityText}
                academicBackground={academicBackground}
              />
            )}
          </div>

          <div className="mt-6 flex items-center justify-between">
            <button
              onClick={onBack}
              className="rounded-lg px-4 py-2 bg-gray-100 hover:bg-gray-200 border border-gray-300 text-gray-700 transition-colors"
              disabled={currentStep === 'scope'}
            >
              Back
            </button>

            {currentStep !== 'familiarity' && (
              <button
                onClick={onNext}
                disabled={(currentStep === 'scope' && (!scope || (scope === 'specific_section' && !scopeDetails.trim()))) ||
                          (currentStep === 'depth' && !depth) ||
                          (currentStep === 'style' && !style) ||
                          (currentStep === 'chunking' && !useChunking)}
                className="rounded-lg px-5 py-2.5 bg-indigo-500 disabled:bg-indigo-300 hover:bg-indigo-600 transition text-white font-medium shadow-md disabled:cursor-not-allowed"
              >
                Next
              </button>
            )}

            {currentStep === 'familiarity' && (
              <button
                onClick={handleStart}
                className="rounded-lg px-5 py-2.5 bg-emerald-500 hover:bg-emerald-600 transition text-white font-medium shadow-md"
              >
                Let's go — start
              </button>
            )}
          </div>
        </div>
      </div>
    </div>
  );
};

// Step components are declared at module level, not inside OnboardingWizard:
// defining them inside the component would give them a new identity on every
// render, causing React to remount them and drop focus in their inputs.
function StepDepth({ depth, setDepth }) {
  return (
    <div>
      <h3 className="text-2xl font-semibold text-gray-900">How deep do you want to go?</h3>
      <p className="mt-2 text-gray-600">Select the level of understanding you're aiming for.</p>
      <div className="mt-6 space-y-3">
        {depthOptions.map(option => (
          <SelectableCard
            key={option.key}
            selected={depth === option.key}
            onClick={() => setDepth(option.key)}
            title={option.title}
            desc={option.desc}
          />
        ))}
      </div>
    </div>
  );
}

function StepStyle({ style, setStyle }) {
  return (
    <div>
      <h3 className="text-2xl font-semibold text-gray-900">What's your learning style preference?</h3>
      <p className="mt-2 text-gray-600">Choose the approach that works best for you.</p>
      <div className="mt-6 space-y-3">
        {styleOptions.map(option => (
          <SelectableCard
            key={option.key}
            selected={style === option.key}
            onClick={() => setStyle(option.key)}
            title={option.title}
            desc={option.desc}
          />
        ))}
      </div>
    </div>
  );
}

function StepChunking({ useChunking, setUseChunking, chunkSize, setChunkSize }) {
  return (
    <div>
      <h3 className="text-2xl font-semibold text-gray-900">How would you like to structure your learning?</h3>
      <p className="mt-2 text-gray-600">Choose how you want to break down the content.</p>
      <div className="mt-6 space-y-4">
        {chunkingOptions.map(option => (
          <SelectableCard
            key={option.key}
            selected={useChunking === option.key}
            onClick={() => setUseChunking(option.key)}
            title={option.title}
            desc={option.desc}
            badge={option.badge}
          />
        ))}
      </div>

      {useChunking === 'guided' && (
        <div className="mt-6">
          <h4 className="text-lg font-medium text-gray-800 mb-3">Preferred chunk size</h4>
          <div className="space-y-3">
            {chunkSizeOptions.map(option => (
              <SelectableCard
                key={option.key}
                selected={chunkSize === option.key}
                onClick={() => setChunkSize(option.key)}
                title={option.title}
                desc={option.desc}
              />
            ))}
          </div>
        </div>
      )}
    </div>
  );
}

function StepFamiliarity({ familiarity, setFamiliarity, familiarityText, setFamiliarityText, academicBackground }) {
  return (
    <div>
      <h3 className="text-2xl font-semibold text-gray-900">How familiar are you with this topic?</h3>
      <p className="mt-2 text-gray-600">This helps us adjust our teaching approach.</p>

      {academicBackground && (
        <div className="mt-4 p-3 bg-gray-50 rounded-lg border border-gray-200">
          <p className="text-sm text-gray-600">Your background: <span className="text-gray-800">{academicBackground}</span></p>
        </div>
      )}

      <div className="mt-6">
        <input
          type="range"
          min={0}
          max={4}
          step={1}
          value={familiarity}
          onChange={e => setFamiliarity(parseInt(e.target.value, 10))}
          className="w-full accent-indigo-500"
        />
        <div className="flex justify-between text-xs text-gray-500 mt-2">
          {familiarityLabels.map((label, i) => (
            <div key={i} className={`text-center ${i === familiarity ? 'text-gray-800 font-medium' : ''}`}>
              {label}
            </div>
          ))}
        </div>
      </div>

      <div className="mt-6">
        <label htmlFor="familiarity-details" className="block text-sm font-medium text-gray-700 mb-2">
          Additional context (optional)
        </label>
        <textarea
          id="familiarity-details"
          value={familiarityText}
          onChange={(e) => setFamiliarityText(e.target.value)}
          placeholder="Any specific knowledge, challenges, or questions about this topic..."
          className="w-full px-4 py-3 rounded-lg border border-gray-300 text-gray-900 placeholder-gray-400 focus:outline-none focus:ring-2 focus:ring-indigo-500 focus:border-indigo-500 resize-none"
          rows="3"
        />
      </div>
    </div>
  );
}

function LoadingScreen({ fileName, phase }) {
  const messages = [
    'Analyzing your paper...',
    'Setting up your learning path...',
    'Preparing personalized tutoring...',
    'All set. Launching your tutor...'
  ];
  const message = messages[Math.min(phase, messages.length - 1)];

  return (
    <div className="h-full flex items-center justify-center p-10 bg-white">
      <div className="text-center">
        <div className="mx-auto h-16 w-16 rounded-full border-4 border-gray-200 border-t-indigo-500 animate-spin" />
        <h3 className="mt-6 text-2xl font-semibold text-gray-900">{message}</h3>
        <p className="mt-2 text-gray-600">{fileName}</p>
        <p className="mt-6 text-xs text-gray-500">Setting up your personalized learning experience...</p>
      </div>
    </div>
  );
}

function SelectableCard({ selected, onClick, title, desc, badge }) {
  return (
    <button
      type="button"
      // Prevent the button from taking focus on mouse/pointer down
      onMouseDown={(e) => e.preventDefault()}
      onPointerDown={(e) => e.preventDefault()}
      onClick={onClick}
      className={`text-left rounded-lg border transition p-4 hover:-translate-y-0.5 active:translate-y-0 bg-white hover:bg-gray-50 w-full ${
        selected ? 'border-indigo-500 ring-2 ring-indigo-200 bg-indigo-50' : 'border-gray-200'
      }`}
    >
      <div className="flex items-start justify-between">
        <div className="text-base font-medium text-gray-900">{title}</div>
        {badge && (
          <span className="text-[10px] uppercase tracking-wide bg-indigo-100 text-indigo-700 px-2 py-1 rounded-md border border-indigo-200">
            {badge}
          </span>
        )}
      </div>
      {desc && <p className="text-sm text-gray-600 mt-1">{desc}</p>}
    </button>
  );
}

// Memoized StepScope to prevent re-renders from parent
const StepScope = memo(({ scope, setScope, initialScopeDetails, onScopeDetailsChange }) => {
  // Local state for immediate UI updates - doesn't trigger parent re-renders
  const [localScopeDetails, setLocalScopeDetails] = useState(initialScopeDetails);
  const localScopeDetailsRef = useRef(null);

  // Auto-focus when the textarea appears
  useLayoutEffect(() => {
    if (scope === 'specific_section' && localScopeDetailsRef.current) {
      const textarea = localScopeDetailsRef.current;
      textarea.focus();
      textarea.setSelectionRange(textarea.value.length, textarea.value.length);
    }
  }, [scope]);

  // Sync local state to parent when the user finishes typing (onBlur) or navigates
  const handleBlur = () => {
    onScopeDetailsChange(localScopeDetails);
  };

  // Update the parent immediately when switching scope types
  const handleScopeChange = (optionKey) => {
    setScope(optionKey);
    if (optionKey !== 'specific_section') {
      setLocalScopeDetails('');
      onScopeDetailsChange('');
    }
  };

  return (
    <div>
      <h3 className="text-2xl font-semibold text-gray-900">What's the scope of your learning?</h3>
      <p className="mt-2 text-gray-600">Choose what you want to focus on in this paper.</p>
      <div className="mt-6 space-y-3">
        {scopeOptions.map(option => (
          <SelectableCard
            key={option.key}
            selected={scope === option.key}
            onClick={() => handleScopeChange(option.key)}
            title={option.title}
            desc={option.desc}
          />
        ))}

        {scope === 'specific_section' && (
          <div className="mt-3 ml-4">
            <label htmlFor="scope-details" className="block text-sm font-medium text-gray-700 mb-2">
              Which section would you like to focus on? <span className="text-red-500">*</span>
            </label>
            <textarea
              ref={localScopeDetailsRef}
              id="scope-details"
              value={localScopeDetails}
              onChange={(e) => setLocalScopeDetails(e.target.value)} // Local state only, to keep focus
              onBlur={handleBlur} // Sync to parent when done typing
              placeholder="e.g., Introduction, Methods section, Equation 3.2, Figure 4 analysis, Discussion of results..."
              className="w-full px-4 py-3 rounded-lg border border-gray-300 text-gray-900 placeholder-gray-400 focus:outline-none focus:ring-2 focus:ring-indigo-500 focus:border-indigo-500 resize-none"
              rows="2"
              required
            />
            {!localScopeDetails.trim() && (
              <p className="mt-1 text-sm text-red-500">Please specify which section you want to focus on.</p>
            )}
          </div>
        )}
      </div>
    </div>
  );
});

export default OnboardingWizard;
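The object `OnboardingWizard` passes to `onComplete` carries nine fields. A consumer might normalize it before starting a session; a minimal sketch, assuming a hypothetical `normalizePreferences` helper (the fallback values mirror the wizard's own initial state where it has one, and are otherwise illustrative):

```javascript
// Fill in defaults for any missing preference fields.
// familiarity 2 ("Comfortable") and chunkSize 'medium' match the wizard's
// initial state; the remaining fallbacks are illustrative choices only.
function normalizePreferences(prefs) {
  return {
    scope: prefs.scope ?? 'entire_paper',
    scopeDetails: prefs.scopeDetails ?? '',
    depth: prefs.depth ?? 'working_understanding',
    style: prefs.style ?? 'concepts',
    familiarity: prefs.familiarity ?? 2,
    chunkSize: prefs.chunkSize ?? 'medium',
    useChunking: prefs.useChunking ?? 'guided',
    familiarityText: prefs.familiarityText ?? '',
    academicBackground: prefs.academicBackground ?? ''
  };
}
```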
@@ -0,0 +1,2260 @@ frontend/src/highlights.json
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
{
  "0": [{
    "id": "highlight_1755775800949",
    "position": {
      "boundingRect": {"x1": 144.01666259765625, "y1": 130.43328857421875, "x2": 673.6499633789062, "y2": 328.26666259765625, "width": 816, "height": 1056, "pageNumber": 2},
      "rects": [
        {"x1": 144.01666259765625, "y1": 130.43328857421875, "x2": 672.1999816894531, "y2": 146.43328857421875, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 145, "x2": 673.316650390625, "y2": 161, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 159.4666748046875, "x2": 672.1499786376953, "y2": 175.4666748046875, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 174.04998779296875, "x2": 673.4833374023438, "y2": 190.04998779296875, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 188.61663818359375, "x2": 280.8000030517578, "y2": 204.61663818359375, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 210.48333740234375, "x2": 673.2999877929688, "y2": 226.48333740234375, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 225.04998779296875, "x2": 673.25, "y2": 241.04998779296875, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 239.51666259765625, "x2": 421.50001525878906, "y2": 255.51666259765625, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 443.81666564941406, "y1": 239.51666259765625, "x2": 672.6500091552734, "y2": 255.51666259765625, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 184.1666717529297, "y1": 245.08331298828125, "x2": 187.2833251953125, "y2": 256.28330993652344, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 422.18333435058594, "y1": 245.08331298828125, "x2": 441.9166717529297, "y2": 256.28330993652344, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 254.08331298828125, "x2": 673.6499633789062, "y2": 270.08331298828125, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 268.66668701171875, "x2": 673.3999633789062, "y2": 284.66668701171875, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 283.23333740234375, "x2": 672.1999969482422, "y2": 299.23333740234375, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 297.70001220703125, "x2": 673.1000213623047, "y2": 313.70001220703125, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 312.26666259765625, "x2": 441.433349609375, "y2": 328.26666259765625, "width": 816, "height": 1056, "pageNumber": 2}
      ]
    },
    "content": {
      "text": "Recurrent neural networks, long short-term memory [13] and gated recurrent [7] neural networks\r in particular, have been firmly established as state of the art approaches in sequence modeling and\r transduction problems such as language modeling and machine translation [ 35 , 2 , 5]. Numerous\r efforts have since continued to push the boundaries of recurrent language models and encoder-decoder\r architectures [38, 24, 15].\r Recurrent models typically factor computation along the symbol positions of the input and output\r sequences. Aligning the positions to steps in computation time, they generate a sequence of hidden\r states ht, as a function of the previous hidden state ht−1 and the input for position t. This inherently\r sequential nature precludes parallelization within training examples, which becomes critical at longer\r sequence lengths, as memory constraints limit batching across examples. Recent work has achieved\r significant improvements in computational efficiency through factorization tricks [ 21 ] and conditional\r computation [ 32 ], while also improving model performance in case of the latter. The fundamental\r constraint of sequential computation, however, remains."
    }
  }, {
    "id": "highlight_1755775878721",
    "position": {
      "boundingRect": {"x1": 144.01666259765625, "y1": 399.6000061035156, "x2": 675.6499633789062, "y2": 430.1833190917969, "width": 816, "height": 1056, "pageNumber": 2},
      "rects": [
        {"x1": 144.01666259765625, "y1": 399.6000061035156, "x2": 673.2999877929688, "y2": 415.6000061035156, "width": 816, "height": 1056, "pageNumber": 2},
        {"x1": 144.01666259765625, "y1": 414.1833190917969, "x2": 675.6499633789062, "y2": 430.1833190917969, "width": 816, "height": 1056, "pageNumber": 2}
      ]
    },
    "content": {
      "text": "In this work we propose the Transformer, a model architecture eschewing recurrence and instead\r relying entirely on an attention mechanism to draw global dependencies between input and output."
    }
  }],
  "1": [{
    "id": "highlight_1755775928209",
    "position": {
      "boundingRect": {"x1": 143.5333251953125, "y1": 580.7166595458984, "x2": 675.6000366210938, "y2": 890.38330078125, "width": 816, "height": 1056, "pageNumber": 3},
      "rects": [
        {"x1": 143.60000610351562, "y1": 580.7166595458984, "x2": 673.8500061035156, "y2": 596.7166595458984, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 595.2833251953125, "x2": 674.9166870117188, "y2": 611.2833251953125, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 609.7499847412109, "x2": 210.3000030517578, "y2": 625.7499847412109, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 642.3833160400391, "x2": 163.5833282470703, "y2": 658.3833160400391, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 173.88333129882812, "y1": 642.3833160400391, "x2": 337.6999969482422, "y2": 658.3833160400391, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 669.0999908447266, "x2": 200.46665954589844, "y2": 685.0999908447266, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 210.68333435058594, "y1": 669.0999908447266, "x2": 672.4833526611328, "y2": 685.0999908447266, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 683.566650390625, "x2": 675.6000366210938, "y2": 699.566650390625, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 143.5333251953125, "y1": 698.1333160400391, "x2": 672.1999969482422, "y2": 714.1333160400391, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 712.7166748046875, "x2": 672.5500030517578, "y2": 728.7166748046875, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 727.2833251953125, "x2": 672.8166809082031, "y2": 743.2833251953125, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 741.75, "x2": 673.3999633789062, "y2": 757.75, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 756.316650390625, "x2": 350.9166717529297, "y2": 772.316650390625, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 378.4499969482422, "y1": 756.316650390625, "x2": 415.74998474121094, "y2": 772.316650390625, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 350.8666534423828, "y1": 761.7833251953125, "x2": 376.2333221435547, "y2": 772.9833221435547, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 787.1499938964844, "x2": 197.60000610351562, "y2": 803.1499938964844, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 207.9166717529297, "y1": 787.1499938964844, "x2": 672.4166870117188, "y2": 803.1499938964844, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 801.7333374023438, "x2": 673.3666381835938, "y2": 817.7333374023438, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 816.1999816894531, "x2": 673.2666625976562, "y2": 832.1999816894531, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 830.7666625976562, "x2": 673.3833618164062, "y2": 846.7666625976562, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 845.3499755859375, "x2": 673.2833251953125, "y2": 861.3499755859375, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 859.9166564941406, "x2": 673.3666381835938, "y2": 875.9166564941406, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 144.01666259765625, "y1": 874.38330078125, "x2": 608.8999938964844, "y2": 890.38330078125, "width": 816, "height": 1056, "pageNumber": 3}
      ]
    },
    "content": {
      "text": "The Transformer follows this overall architecture using stacked self-attention and point-wise, fully\r connected layers for both the encoder and decoder, shown in the left and right halves of Figure 1,\r respectively.\r 3.1 Encoder and Decoder Stacks\r Encoder: The encoder is composed of a stack of N = 6 identical layers. Each layer has two\r sub-layers. The first is a multi-head self-attention mechanism, and the second is a simple, position-\r wise fully connected feed-forward network. We employ a residual connection [ 11 ] around each of\r the two sub-layers, followed by layer normalization [1]. That is, the output of each sub-layer is\r LayerNorm(x + Sublayer(x)), where Sublayer(x) is the function implemented by the sub-layer\r itself. To facilitate these residual connections, all sub-layers in the model, as well as the embedding\r layers, produce outputs of dimension dmodel = 512.\r Decoder: The decoder is also composed of a stack of N = 6 identical layers. In addition to the two\r sub-layers in each encoder layer, the decoder inserts a third sub-layer, which performs multi-head\r attention over the output of the encoder stack. Similar to the encoder, we employ residual connections\r around each of the sub-layers, followed by layer normalization. We also modify the self-attention\r sub-layer in the decoder stack to prevent positions from attending to subsequent positions. This\r masking, combined with fact that the output embeddings are offset by one position, ensures that the\r predictions for position i can depend only on the known outputs at positions less than i."
    }
  }],
  "2": [{
    "id": "highlight_1755777477064",
    "position": {
      "boundingRect": {"x1": 143.53334045410156, "y1": 933.2333068847656, "x2": 674.8833160400391, "y2": 964.6999816894531, "width": 816, "height": 1056, "pageNumber": 3},
      "rects": [
        {"x1": 143.53334045410156, "y1": 933.2333068847656, "x2": 674.8833160400391, "y2": 950.2333068847656, "width": 816, "height": 1056, "pageNumber": 3},
        {"x1": 143.53334045410156, "y1": 947.6999816894531, "x2": 673.2666778564453, "y2": 964.6999816894531, "width": 816, "height": 1056, "pageNumber": 3}
      ]
    },
    "content": {
      "text": "An attention function can be described as mapping a query and a set of key-value pairs to an output,\r where the query, keys, values, and output are all vectors. The output is computed as a weighted sum"
    }
  }, {
    "id": "highlight_1755777635270",
    "position": {
      "boundingRect": {"x1": 143.36666870117188, "y1": 420.75, "x2": 674.5166625976562, "y2": 825.8000183105469, "width": 816, "height": 1056, "pageNumber": 4},
      "rects": [
        {"x1": 144.0166778564453, "y1": 420.75, "x2": 673.2666778564453, "y2": 437.75, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 435.2166748046875, "x2": 325.1333312988281, "y2": 452.2166748046875, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 465.95001220703125, "x2": 173.53334045410156, "y2": 482.95001220703125, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 183.83334350585938, "y1": 465.95001220703125, "x2": 352.2166748046875, "y2": 482.95001220703125, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 143.36666870117188, "y1": 490.8666687011719, "x2": 673.2166442871094, "y2": 507.8666687011719, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 505.3333435058594, "x2": 672.4166870117188, "y2": 522.3333435058594, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 333.41668701171875, "y1": 511.4166717529297, "x2": 340.0333557128906, "y2": 524.7000122070312, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 315.7833251953125, "y1": 511.5166778564453, "x2": 320.01666259765625, "y2": 522.5166778564453, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 468.61663818359375, "y1": 511.5166778564453, "x2": 476.0166931152344, "y2": 522.5166778564453, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 519.9166717529297, "x2": 333.066650390625, "y2": 536.9166717529297, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 344.433349609375, "y1": 519.9166717529297, "x2": 672.7166748046875, "y2": 536.9166717529297, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 351.3666687011719, "y1": 526.0833435058594, "x2": 355.6000061035156, "y2": 537.0833435058594, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 143.68333435058594, "y1": 534.4833374023438, "x2": 180.7166748046875, "y2": 551.4833374023438, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 556.3500061035156, "x2": 673.5166778564453, "y2": 573.3500061035156, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 570.9166564941406, "x2": 672.0999755859375, "y2": 587.9166564941406, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 585.3833312988281, "x2": 273.48333740234375, "y2": 602.3833312988281, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 496.85003662109375, "y1": 619.1166687011719, "x2": 501.38336181640625, "y2": 630.1166687011719, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 474.16668701171875, "y1": 619.8166809082031, "x2": 496.00001525878906, "y2": 636.8166809082031, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 293.26666259765625, "y1": 628.7833557128906, "x2": 484.0500183105469, "y2": 645.7833557128906, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 506.816650390625, "y1": 628.7833557128906, "x2": 522.7833251953125, "y2": 645.7833557128906, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 657.3666381835938, "y1": 628.7833557128906, "x2": 672.8999786376953, "y2": 645.7833557128906, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 488.5333251953125, "y1": 638.7166748046875, "x2": 495.51666259765625, "y2": 655.7166748046875, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 495.38336181640625, "y1": 644.7833557128906, "x2": 499.61669921875, "y2": 655.7833557128906, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 143.60000610351562, "y1": 663.8500061035156, "x2": 674.5166625976562, "y2": 680.8500061035156, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 678.4166564941406, "x2": 673.5166778564453, "y2": 695.4166564941406, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 692.8833312988281, "x2": 158.1999969482422, "y2": 709.8833312988281, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 181.7166748046875, "y1": 692.8833312988281, "x2": 673.1666870117188, "y2": 709.8833312988281, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 160.08334350585938, "y1": 695.4666748046875, "x2": 673.1666870117188, "y2": 706.4666748046875, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 168.81666564941406, "y1": 702.6499938964844, "x2": 177.3000030517578, "y2": 713.6499938964844, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 710.5166931152344, "x2": 673.3499908447266, "y2": 727.5166931152344, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 724.9833374023438, "x2": 673.3833160400391, "y2": 741.9833374023438, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 144.0166778564453, "y1": 739.5666809082031, "x2": 288.65000915527344, "y2": 756.5666809082031, "width": 816, "height": 1056, "pageNumber": 4},
        {"x1": 143.36666870117188, "y1": 761.4166564941406, "x2": 672.9666748046875,
|
| 768 |
+
"y2": 778.4166564941406,
|
| 769 |
+
"width": 816,
|
| 770 |
+
"height": 1056,
|
| 771 |
+
"pageNumber": 4
|
| 772 |
+
},
|
| 773 |
+
{
|
| 774 |
+
"x1": 289.01666259765625,
|
| 775 |
+
"y1": 767.4833374023438,
|
| 776 |
+
"x2": 296.75001525878906,
|
| 777 |
+
"y2": 778.4833374023438,
|
| 778 |
+
"width": 816,
|
| 779 |
+
"height": 1056,
|
| 780 |
+
"pageNumber": 4
|
| 781 |
+
},
|
| 782 |
+
{
|
| 783 |
+
"x1": 144.0166778564453,
|
| 784 |
+
"y1": 775.9833374023438,
|
| 785 |
+
"x2": 672.4499664306641,
|
| 786 |
+
"y2": 792.9833374023438,
|
| 787 |
+
"width": 816,
|
| 788 |
+
"height": 1056,
|
| 789 |
+
"pageNumber": 4
|
| 790 |
+
},
|
| 791 |
+
{
|
| 792 |
+
"x1": 455.4000244140625,
|
| 793 |
+
"y1": 782.0666809082031,
|
| 794 |
+
"x2": 463.13331604003906,
|
| 795 |
+
"y2": 793.0666809082031,
|
| 796 |
+
"width": 816,
|
| 797 |
+
"height": 1056,
|
| 798 |
+
"pageNumber": 4
|
| 799 |
+
},
|
| 800 |
+
{
|
| 801 |
+
"x1": 144.0166778564453,
|
| 802 |
+
"y1": 790.5666809082031,
|
| 803 |
+
"x2": 673.3500366210938,
|
| 804 |
+
"y2": 807.5666809082031,
|
| 805 |
+
"width": 816,
|
| 806 |
+
"height": 1056,
|
| 807 |
+
"pageNumber": 4
|
| 808 |
+
},
|
| 809 |
+
{
|
| 810 |
+
"x1": 150.86666870117188,
|
| 811 |
+
"y1": 796.6333312988281,
|
| 812 |
+
"x2": 155.10000610351562,
|
| 813 |
+
"y2": 807.6333312988281,
|
| 814 |
+
"width": 816,
|
| 815 |
+
"height": 1056,
|
| 816 |
+
"pageNumber": 4
|
| 817 |
+
},
|
| 818 |
+
{
|
| 819 |
+
"x1": 595.0166625976562,
|
| 820 |
+
"y1": 803.9166564941406,
|
| 821 |
+
"x2": 599.7333221435547,
|
| 822 |
+
"y2": 814.9166564941406,
|
| 823 |
+
"width": 816,
|
| 824 |
+
"height": 1056,
|
| 825 |
+
"pageNumber": 4
|
| 826 |
+
},
|
| 827 |
+
{
|
| 828 |
+
"x1": 144.0166778564453,
|
| 829 |
+
"y1": 805.0333557128906,
|
| 830 |
+
"x2": 592.2333679199219,
|
| 831 |
+
"y2": 822.0333557128906,
|
| 832 |
+
"width": 816,
|
| 833 |
+
"height": 1056,
|
| 834 |
+
"pageNumber": 4
|
| 835 |
+
},
|
| 836 |
+
{
|
| 837 |
+
"x1": 596.316650390625,
|
| 838 |
+
"y1": 814.8000183105469,
|
| 839 |
+
"x2": 604.88330078125,
|
| 840 |
+
"y2": 825.8000183105469,
|
| 841 |
+
"width": 816,
|
| 842 |
+
"height": 1056,
|
| 843 |
+
"pageNumber": 4
|
| 844 |
+
}
|
| 845 |
+
]
|
| 846 |
+
},
|
| 847 |
+
"content": {
|
| 848 |
+
"text": "of the values, where the weight assigned to each value is computed by a compatibility function of the\r query with the corresponding key.\r 3.2.1 Scaled Dot-Product Attention\r We call our particular attention \"Scaled Dot-Product Attention\" (Figure 2). The input consists of\r queries and keys of dimension dk, and values of dimension dv . We compute the dot products of the\r query with all keys, divide each by √dk, and apply a softmax function to obtain the weights on the\r values.\r In practice, we compute the attention function on a set of queries simultaneously, packed together\r into a matrix Q. The keys and values are also packed together into matrices K and V . We compute\r the matrix of outputs as:\r Attention(Q, K, V ) = softmax( QKT\r √dk\r )V (1)\r The two most commonly used attention functions are additive attention [ 2], and dot-product (multi-\r plicative) attention. Dot-product attention is identical to our algorithm, except for the scaling factor\r of 1√dk\r . Additive attention computes the compatibility function using a feed-forward network with\r a single hidden layer. While the two are similar in theoretical complexity, dot-product attention is\r much faster and more space-efficient in practice, since it can be implemented using highly optimized\r matrix multiplication code.\r While for small values of dk the two mechanisms perform similarly, additive attention outperforms\r dot product attention without scaling for larger values of dk [3 ]. We suspect that for large values of\r dk, the dot products grow large in magnitude, pushing the softmax function into regions where it has\r extremely small gradients 4. To counteract this effect, we scale the dot products by 1√dk\r "
|
| 849 |
+
}
|
| 850 |
+
}, {
|
    "id": "highlight_1755777652333",
    "position": {
      "boundingRect": {
        "x1": 143.68333435058594,
        "y1": 932.4833068847656,
        "x2": 671.88330078125,
        "y2": 966.3499755859375,
        "width": 816,
        "height": 1056,
        "pageNumber": 4
      },
      "rects": [
        {
          "x1": 160.83334350585938,
          "y1": 932.4833068847656,
          "x2": 671.7000274658203,
          "y2": 946.4833068847656,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 478.5,
          "y1": 941.3999633789062,
          "x2": 485.7166748046875,
          "y2": 953.3999633789062,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 491.4666748046875,
          "y1": 947.0499877929688,
          "x2": 499.54998779296875,
          "y2": 957.0499877929688,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 143.68333435058594,
          "y1": 949.5833129882812,
          "x2": 477.76666259765625,
          "y2": 963.5833129882812,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 510,
          "y1": 949.5833129882812,
          "x2": 671.88330078125,
          "y2": 963.5833129882812,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 662.75,
          "y1": 955.1499633789062,
          "x2": 669.8499908447266,
          "y2": 963.1166229248047,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 491.4666748046875,
          "y1": 956.3499755859375,
          "x2": 527.9833526611328,
          "y2": 966.3499755859375,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        }
      ]
    },
    "content": {
      "text": "4To illustrate why the dot products get large, assume that the components of q and k are independent random\r variables with mean 0 and variance 1. Then their dot product, q · k = Pdk\r i=1 qiki, has mean 0 and variance dk ."
    }
  }],
  "3": [{
    "id": "highlight_1755776743896",
    "position": {
      "boundingRect": {
        "x1": 143.53334045410156,
        "y1": 863.8500061035156,
        "x2": 674.2500152587891,
        "y2": 924.5333404541016,
        "width": 816,
        "height": 1056,
        "pageNumber": 4
      },
      "rects": [
        {
          "x1": 144.0166778564453,
          "y1": 863.8500061035156,
          "x2": 442.54998779296875,
          "y2": 880.8500061035156,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 466.41668701171875,
          "y1": 863.8500061035156,
          "x2": 674.2500152587891,
          "y2": 880.8500061035156,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 442.433349609375,
          "y1": 869.9166717529297,
          "x2": 465.8166809082031,
          "y2": 880.9166717529297,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 143.53334045410156,
          "y1": 878.3166809082031,
          "x2": 672.4166717529297,
          "y2": 895.3166809082031,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 144.0166778564453,
          "y1": 892.8833465576172,
          "x2": 672.8666687011719,
          "y2": 909.8833465576172,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 260.8666687011719,
          "y1": 899.0666809082031,
          "x2": 265.1000061035156,
          "y2": 910.0666809082031,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 280.933349609375,
          "y1": 899.0666809082031,
          "x2": 288.75001525878906,
          "y2": 910.0666809082031,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 320.3500061035156,
          "y1": 899.0666809082031,
          "x2": 327.75001525878906,
          "y2": 910.0666809082031,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 144.0166778564453,
          "y1": 907.4666748046875,
          "x2": 672.2167053222656,
          "y2": 924.4666748046875,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        },
        {
          "x1": 596.4833374023438,
          "y1": 913.5333404541016,
          "x2": 603.9666442871094,
          "y2": 924.5333404541016,
          "width": 816,
          "height": 1056,
          "pageNumber": 4
        }
      ]
    },
    "content": {
      "text": "Instead of performing a single attention function with dmodel-dimensional keys, values and queries,\r we found it beneficial to linearly project the queries, keys and values h times with different, learned\r linear projections to dk, dk and dv dimensions, respectively. On each of these projected versions of\r queries, keys and values we then perform the attention function in parallel, yielding dv -dimensional"
    }
  }, {
    "id": "highlight_1755776791875",
    "position": {
      "boundingRect": {
        "x1": 143.36666870117188,
        "y1": 96.98333740234375,
        "x2": 673.2666778564453,
        "y2": 356.43333435058594,
        "width": 816,
        "height": 1056,
        "pageNumber": 5
      },
      "rects": [
        {
          "x1": 144.0166778564453,
          "y1": 96.98333740234375,
          "x2": 673.2666778564453,
          "y2": 113.98333740234375,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 111.55000305175781,
          "x2": 254.2166748046875,
          "y2": 128.5500030517578,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 133.41665649414062,
          "x2": 673.2666778564453,
          "y2": 150.41665649414062,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 147.98333740234375,
          "x2": 594.7833404541016,
          "y2": 164.98333740234375,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 539.7833251953125,
          "y1": 193.5500030517578,
          "x2": 545.9333190917969,
          "y2": 204.5500030517578,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 249.2833251953125,
          "y1": 194.86666870117188,
          "x2": 540.9833374023438,
          "y2": 211.86666870117188,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 457.60003662109375,
          "y1": 201.0500030517578,
          "x2": 462.3166961669922,
          "y2": 212.0500030517578,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 513.75,
          "y1": 201.0500030517578,
          "x2": 518.6333312988281,
          "y2": 212.0500030517578,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 299.29998779296875,
          "y1": 217.36666870117188,
          "x2": 565.5999908447266,
          "y2": 234.36666870117188,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 362.3833312988281,
          "y1": 223.43333435058594,
          "x2": 367.50001525878906,
          "y2": 234.43333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 509.25,
          "y1": 224.6999969482422,
          "x2": 515.1166534423828,
          "y2": 235.6999969482422,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 550.4666748046875,
          "y1": 224.6999969482422,
          "x2": 556.3999938964844,
          "y2": 235.6999969482422,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 467.7166748046875,
          "y1": 225.13333129882812,
          "x2": 473.5833435058594,
          "y2": 236.13333129882812,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 143.36666870117188,
          "y1": 272.1666717529297,
          "x2": 627.4166870117188,
          "y2": 290.1666717529297,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 619.4166870117188,
          "y1": 273.9499969482422,
          "x2": 668.6000366210938,
          "y2": 286.9499969482422,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 467.066650390625,
          "y1": 275.1333465576172,
          "x2": 473.63336181640625,
          "y2": 283.1333465576172,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 566.8666381835938,
          "y1": 275.1333465576172,
          "x2": 573.433349609375,
          "y2": 283.1333465576172,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 492.2833251953125,
          "y1": 279.71665954589844,
          "x2": 498.13331604003906,
          "y2": 290.71665954589844,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 592.0833129882812,
          "y1": 279.71665954589844,
          "x2": 597.9333038330078,
          "y2": 290.71665954589844,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 393.7166748046875,
          "y1": 279.93333435058594,
          "x2": 399.56666564941406,
          "y2": 290.93333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 193.63333129882812,
          "y1": 287.43333435058594,
          "x2": 268.3000183105469,
          "y2": 305.43333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 180.90000915527344,
          "y1": 287.74998474121094,
          "x2": 191.08334350585938,
          "y2": 298.74998474121094,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 288.43333435058594,
          "x2": 182.0166778564453,
          "y2": 305.43333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 310.3000030517578,
          "x2": 672.3000030517578,
          "y2": 327.3000030517578,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 324.86668395996094,
          "x2": 151,
          "y2": 341.86668395996094,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 161.15000915527344,
          "y1": 324.86668395996094,
          "x2": 213.10000610351562,
          "y2": 341.86668395996094,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 236.95001220703125,
          "y1": 324.86668395996094,
          "x2": 672.8666687011719,
          "y2": 341.86668395996094,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 150.86666870117188,
          "y1": 330.93333435058594,
          "x2": 158.68333435058594,
          "y2": 341.93333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 182.03334045410156,
          "y1": 330.93333435058594,
          "x2": 189.53334045410156,
          "y2": 341.93333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 213.04998779296875,
          "y1": 330.93333435058594,
          "x2": 236.43331909179688,
          "y2": 341.93333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 339.43333435058594,
          "x2": 493.1666717529297,
          "y2": 356.43333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        }
      ]
    },
    "content": {
      "text": "output values. These are concatenated and once again projected, resulting in the final values, as\r depicted in Figure 2.\r Multi-head attention allows the model to jointly attend to information from different representation\r subspaces at different positions. With a single attention head, averaging inhibits this.\r MultiHead(Q, K, V ) = Concat(head1, ..., headh)W O\r where headi = Attention(QW Q\r i , KW K\r i , V W V\r i )\r Where the projections are parameter matrices W Q\r i ∈ Rdmodel×dk , W K\r i ∈ Rdmodel×dk , W V\r i ∈ Rdmodel×dv\r and W O ∈ Rhdv ×dmodel .\r In this work we employ h = 8 parallel attention layers, or heads. For each of these we use\r dk = dv = dmodel/h = 64. Due to the reduced dimension of each head, the total computational cost\r is similar to that of single-head attention with full dimensionality."
    }
  }],
  "4": [{
    "id": "highlight_1755776822210",
    "position": {
      "boundingRect": {
        "x1": 143.60000610351562,
        "y1": 397,
        "x2": 674.7833557128906,
        "y2": 644.1000213623047,
        "width": 816,
        "height": 1056,
        "pageNumber": 5
      },
      "rects": [
        {
          "x1": 143.60000610351562,
          "y1": 397,
          "x2": 497.6833190917969,
          "y2": 414,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 180.48333740234375,
          "y1": 424.8666687011719,
          "x2": 674.7833557128906,
          "y2": 441.8666687011719,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 439.3333435058594,
          "x2": 673.6333312988281,
          "y2": 456.3333435058594,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 453.9166717529297,
          "x2": 673.1999816894531,
          "y2": 470.9166717529297,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 468.48333740234375,
          "x2": 673.2333374023438,
          "y2": 485.48333740234375,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 482.95001220703125,
          "x2": 243.9499969482422,
          "y2": 499.95001220703125,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 180.48333740234375,
          "y1": 504.06666564941406,
          "x2": 673.2000122070312,
          "y2": 521.0666656494141,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 518.6500091552734,
          "x2": 673.1999816894531,
          "y2": 535.6500091552734,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 533.2166595458984,
          "x2": 673.1333312988281,
          "y2": 550.2166595458984,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 547.7833404541016,
          "x2": 236.56666564941406,
          "y2": 564.7833404541016,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 180.48333740234375,
          "y1": 568.9166717529297,
          "x2": 673.2166748046875,
          "y2": 585.9166717529297,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 583.3833465576172,
          "x2": 673.1833190917969,
          "y2": 600.3833465576172,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 597.9499969482422,
          "x2": 673.1833190917969,
          "y2": 614.9499969482422,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 612.5166778564453,
          "x2": 672.3333740234375,
          "y2": 629.5166778564453,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 191.83334350585938,
          "y1": 627.1000213623047,
          "x2": 486.8000183105469,
          "y2": 644.1000213623047,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        }
      ]
    },
    "content": {
      "text": "The Transformer uses multi-head attention in three different ways:\r • In \"encoder-decoder attention\" layers, the queries come from the previous decoder layer,\r and the memory keys and values come from the output of the encoder. This allows every\r position in the decoder to attend over all positions in the input sequence. This mimics the\r typical encoder-decoder attention mechanisms in sequence-to-sequence models such as\r [38, 2, 9].\r • The encoder contains self-attention layers. In a self-attention layer all of the keys, values\r and queries come from the same place, in this case, the output of the previous layer in the\r encoder. Each position in the encoder can attend to all positions in the previous layer of the\r encoder.\r • Similarly, self-attention layers in the decoder allow each position in the decoder to attend to\r all positions in the decoder up to and including that position. We need to prevent leftward\r information flow in the decoder to preserve the auto-regressive property. We implement this\r inside of scaled dot-product attention by masking out (setting to −∞) all values in the input\r of the softmax which correspond to illegal connections"
    }
  }],
  "5": [{
    "id": "highlight_1755776852579",
    "position": {
      "boundingRect": {
        "x1": 143.36666870117188,
        "y1": 688.2333374023438,
        "x2": 675.36669921875,
        "y2": 845.4333190917969,
        "width": 816,
        "height": 1056,
        "pageNumber": 5
      },
      "rects": [
        {
          "x1": 144.0166778564453,
          "y1": 688.2333374023438,
          "x2": 673.6833038330078,
          "y2": 705.2333374023438,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 702.8166656494141,
          "x2": 673.3166656494141,
          "y2": 719.8166656494141,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 717.3833312988281,
          "x2": 536.3666534423828,
          "y2": 734.3833312988281,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 302.566650390625,
          "y1": 755.6166687011719,
          "x2": 508.81663513183594,
          "y2": 772.6166687011719,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 657.3666381835938,
          "y1": 755.6166687011719,
          "x2": 672.8999786376953,
          "y2": 772.6166687011719,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 428.066650390625,
          "y1": 761.7833557128906,
          "x2": 435.4666748046875,
          "y2": 772.7833557128906,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 455.9666748046875,
          "y1": 761.7833557128906,
          "x2": 460.68333435058594,
          "y2": 772.7833557128906,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 479.63336181640625,
          "y1": 761.7833557128906,
          "x2": 486.9666442871094,
          "y2": 772.7833557128906,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 507.4666748046875,
          "y1": 761.7833557128906,
          "x2": 514.8666534423828,
          "y2": 772.7833557128906,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 143.36666870117188,
          "y1": 784.6499938964844,
          "x2": 673.3000183105469,
          "y2": 801.6499938964844,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 799.2166748046875,
          "x2": 675.36669921875,
          "y2": 816.2166748046875,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 143.60000610351562,
          "y1": 813.8000183105469,
          "x2": 386.24998474121094,
          "y2": 830.8000183105469,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 416.066650390625,
          "y1": 813.8000183105469,
          "x2": 673.0833435058594,
          "y2": 830.8000183105469,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 386.20001220703125,
          "y1": 819.8666687011719,
          "x2": 411.56666564941406,
          "y2": 830.8666687011719,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 144.0166778564453,
          "y1": 828.3666687011719,
          "x2": 151,
          "y2": 845.3666687011719,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 167.68333435058594,
          "y1": 828.3666687011719,
          "x2": 211.66665649414062,
          "y2": 845.3666687011719,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        },
        {
          "x1": 150.86666870117188,
          "y1": 834.4333190917969,
          "x2": 164.40000915527344,
          "y2": 845.4333190917969,
          "width": 816,
          "height": 1056,
          "pageNumber": 5
        }
      ]
    },
    "content": {
      "text": "In addition to attention sub-layers, each of the layers in our encoder and decoder contains a fully\r connected feed-forward network, which is applied to each position separately and identically. This\r consists of two linear transformations with a ReLU activation in between.\r FFN(x) = max(0, xW1 + b1)W2 + b2 (2)\r While the linear transformations are the same across different positions, they use different parameters\r from layer to layer. Another way of describing this is as two convolutions with kernel size 1.\r The dimensionality of input and output is dmodel = 512, and the inner-layer has dimensionality\r df f = 2048."
    }
  }],
| 1664 |
+
"6": [{
|
| 1665 |
+
"id": "highlight_1755776884173",
|
| 1666 |
+
"position": {
|
| 1667 |
+
"boundingRect": {
|
| 1668 |
+
"x1": 144.0166778564453,
|
| 1669 |
+
"y1": 889.5166473388672,
|
| 1670 |
+
"x2": 674.816650390625,
|
| 1671 |
+
"y2": 964.8666381835938,
|
| 1672 |
+
"width": 816,
|
| 1673 |
+
"height": 1056,
|
| 1674 |
+
"pageNumber": 5
|
| 1675 |
+
},
|
| 1676 |
+
"rects": [
|
| 1677 |
+
{
|
| 1678 |
+
"x1": 144.0166778564453,
|
| 1679 |
+
"y1": 889.5166473388672,
|
| 1680 |
+
"x2": 673.3999786376953,
|
| 1681 |
+
"y2": 906.5166473388672,
|
| 1682 |
+
"width": 816,
|
| 1683 |
+
"height": 1056,
|
| 1684 |
+
"pageNumber": 5
|
| 1685 |
+
},
|
| 1686 |
+
{
|
| 1687 |
+
"x1": 144.0166778564453,
|
| 1688 |
+
"y1": 904.0833129882812,
|
| 1689 |
+
"x2": 409.183349609375,
|
| 1690 |
+
"y2": 921.0833129882812,
|
| 1691 |
+
"width": 816,
|
| 1692 |
+
"height": 1056,
|
| 1693 |
+
"pageNumber": 5
|
| 1694 |
+
},
|
| 1695 |
+
{
|
| 1696 |
+
"x1": 433.04998779296875,
|
| 1697 |
+
"y1": 904.0833129882812,
|
| 1698 |
+
"x2": 674.816650390625,
|
| 1699 |
+
"y2": 921.0833129882812,
|
| 1700 |
+
"width": 816,
|
| 1701 |
+
"height": 1056,
|
| 1702 |
+
"pageNumber": 5
|
| 1703 |
+
},
|
| 1704 |
+
{
|
| 1705 |
+
"x1": 409.1333312988281,
|
| 1706 |
+
"y1": 910.2666473388672,
|
| 1707 |
+
"x2": 432.5166778564453,
|
| 1708 |
+
"y2": 921.2666473388672,
|
| 1709 |
+
"width": 816,
|
| 1710 |
+
"height": 1056,
|
| 1711 |
+
"pageNumber": 5
|
| 1712 |
+
},
|
| 1713 |
+
{
|
| 1714 |
+
"x1": 144.0166778564453,
|
| 1715 |
+
"y1": 918.6499786376953,
|
| 1716 |
+
"x2": 673.2333526611328,
|
| 1717 |
+
"y2": 935.6499786376953,
|
| 1718 |
+
"width": 816,
|
| 1719 |
+
"height": 1056,
|
| 1720 |
+
"pageNumber": 5
|
| 1721 |
+
},
|
| 1722 |
+
{
|
| 1723 |
+
"x1": 144.0166778564453,
|
| 1724 |
+
"y1": 933.2333068847656,
|
| 1725 |
+
"x2": 673.7166900634766,
|
| 1726 |
+
"y2": 950.2333068847656,
|
| 1727 |
+
"width": 816,
|
| 1728 |
+
"height": 1056,
|
| 1729 |
+
"pageNumber": 5
|
| 1730 |
+
},
|
| 1731 |
+
{
|
| 1732 |
+
"x1": 629.1333618164062,
|
| 1733 |
+
"y1": 939.2999725341797,
|
| 1734 |
+
"x2": 635.7500305175781,
|
| 1735 |
+
"y2": 952.5833129882812,
|
| 1736 |
+
"width": 816,
|
| 1737 |
+
"height": 1056,
|
| 1738 |
+
"pageNumber": 5
|
| 1739 |
+
},
|
| 1740 |
+
{
|
| 1741 |
+
"x1": 144.0166778564453,
|
| 1742 |
+
"y1": 947.6999816894531,
|
| 1743 |
+
"x2": 629.0333251953125,
|
| 1744 |
+
"y2": 964.6999816894531,
|
| 1745 |
+
"width": 816,
|
| 1746 |
+
"height": 1056,
|
| 1747 |
+
"pageNumber": 5
|
| 1748 |
+
},
|
| 1749 |
+
{
|
| 1750 |
+
"x1": 640.2333374023438,
|
| 1751 |
+
"y1": 947.6999816894531,
|
| 1752 |
+
"x2": 647.2166748046875,
|
| 1753 |
+
"y2": 964.6999816894531,
|
| 1754 |
+
"width": 816,
|
| 1755 |
+
"height": 1056,
|
| 1756 |
+
"pageNumber": 5
|
| 1757 |
+
},
|
| 1758 |
+
{
|
| 1759 |
+
"x1": 671.066650390625,
|
| 1760 |
+
"y1": 947.6999816894531,
|
| 1761 |
+
"x2": 674.4166564941406,
|
| 1762 |
+
"y2": 964.6999816894531,
|
| 1763 |
+
"width": 816,
|
| 1764 |
+
"height": 1056,
|
| 1765 |
+
"pageNumber": 5
|
| 1766 |
+
},
|
| 1767 |
+
{
|
| 1768 |
+
"x1": 647.1666870117188,
|
| 1769 |
+
"y1": 953.8666381835938,
|
| 1770 |
+
"x2": 670.5500183105469,
|
| 1771 |
+
"y2": 964.8666381835938,
|
| 1772 |
+
"width": 816,
|
| 1773 |
+
"height": 1056,
|
| 1774 |
+
"pageNumber": 5
|
| 1775 |
+
}
|
| 1776 |
+
]
|
| 1777 |
+
},
|
| 1778 |
+
"content": {
|
| 1779 |
+
"text": "Similarly to other sequence transduction models, we use learned embeddings to convert the input\r tokens and output tokens to vectors of dimension dmodel. We also use the usual learned linear transfor-\r mation and softmax function to convert the decoder output to predicted next-token probabilities. In\r our model, we share the same weight matrix between the two embedding layers and the pre-softmax\r linear transformation, similar to [ 30 ]. In the embedding layers, we multiply those weights by √dmodel."
|
| 1780 |
+
}
|
| 1781 |
+
}, {
|
| 1782 |
+
"id": "highlight_1755776906056",
|
| 1783 |
+
"position": {
|
| 1784 |
+
"boundingRect": {
|
| 1785 |
+
"x1": 143.36666870117188,
|
| 1786 |
+
"y1": 309.6666717529297,
|
| 1787 |
+
"x2": 674.9500274658203,
|
| 1788 |
+
"y2": 641.8833312988281,
|
| 1789 |
+
"width": 816,
|
| 1790 |
+
"height": 1056,
|
| 1791 |
+
"pageNumber": 6
|
| 1792 |
+
},
|
| 1793 |
+
"rects": [
|
| 1794 |
+
{
|
| 1795 |
+
"x1": 144.0166778564453,
|
| 1796 |
+
"y1": 309.6666717529297,
|
| 1797 |
+
"x2": 673.2500152587891,
|
| 1798 |
+
"y2": 326.6666717529297,
|
| 1799 |
+
"width": 816,
|
| 1800 |
+
"height": 1056,
|
| 1801 |
+
"pageNumber": 6
|
| 1802 |
+
},
|
| 1803 |
+
{
|
| 1804 |
+
"x1": 144.0166778564453,
|
| 1805 |
+
"y1": 324.1333312988281,
|
| 1806 |
+
"x2": 673.3166656494141,
|
| 1807 |
+
"y2": 341.1333312988281,
|
| 1808 |
+
"width": 816,
|
| 1809 |
+
"height": 1056,
|
| 1810 |
+
"pageNumber": 6
|
| 1811 |
+
},
|
| 1812 |
+
{
|
| 1813 |
+
"x1": 144.0166778564453,
|
| 1814 |
+
"y1": 338.6999969482422,
|
| 1815 |
+
"x2": 673.2833404541016,
|
| 1816 |
+
"y2": 355.6999969482422,
|
| 1817 |
+
"width": 816,
|
| 1818 |
+
"height": 1056,
|
| 1819 |
+
"pageNumber": 6
|
| 1820 |
+
},
|
| 1821 |
+
{
|
| 1822 |
+
"x1": 144.0166778564453,
|
| 1823 |
+
"y1": 353.2666778564453,
|
| 1824 |
+
"x2": 648.183349609375,
|
| 1825 |
+
"y2": 370.2666778564453,
|
| 1826 |
+
"width": 816,
|
| 1827 |
+
"height": 1056,
|
| 1828 |
+
"pageNumber": 6
|
| 1829 |
+
},
|
| 1830 |
+
{
|
| 1831 |
+
"x1": 648.066650390625,
|
| 1832 |
+
"y1": 359.3500061035156,
|
| 1833 |
+
"x2": 671.4499816894531,
|
| 1834 |
+
"y2": 370.3500061035156,
|
| 1835 |
+
"width": 816,
|
| 1836 |
+
"height": 1056,
|
| 1837 |
+
"pageNumber": 6
|
| 1838 |
+
},
|
| 1839 |
+
{
|
| 1840 |
+
"x1": 144.0166778564453,
|
| 1841 |
+
"y1": 367.75,
|
| 1842 |
+
"x2": 674.9500274658203,
|
| 1843 |
+
"y2": 384.75,
|
| 1844 |
+
"width": 816,
|
| 1845 |
+
"height": 1056,
|
| 1846 |
+
"pageNumber": 6
|
| 1847 |
+
},
|
| 1848 |
+
{
|
| 1849 |
+
"x1": 144.0166778564453,
|
| 1850 |
+
"y1": 382.31666564941406,
|
| 1851 |
+
"x2": 257.68333435058594,
|
| 1852 |
+
"y2": 399.31666564941406,
|
| 1853 |
+
"width": 816,
|
| 1854 |
+
"height": 1056,
|
| 1855 |
+
"pageNumber": 6
|
| 1856 |
+
},
|
| 1857 |
+
{
|
| 1858 |
+
"x1": 144.0166778564453,
|
| 1859 |
+
"y1": 404.1666717529297,
|
| 1860 |
+
"x2": 520.7166900634766,
|
| 1861 |
+
"y2": 421.1666717529297,
|
| 1862 |
+
"width": 816,
|
| 1863 |
+
"height": 1056,
|
| 1864 |
+
"pageNumber": 6
|
| 1865 |
+
},
|
| 1866 |
+
{
|
| 1867 |
+
"x1": 471.9666748046875,
|
| 1868 |
+
"y1": 447.29998779296875,
|
| 1869 |
+
"x2": 510.10003662109375,
|
| 1870 |
+
"y2": 458.29998779296875,
|
| 1871 |
+
"width": 816,
|
| 1872 |
+
"height": 1056,
|
| 1873 |
+
"pageNumber": 6
|
| 1874 |
+
},
|
| 1875 |
+
{
|
| 1876 |
+
"x1": 313.98333740234375,
|
| 1877 |
+
"y1": 448.73333740234375,
|
| 1878 |
+
"x2": 334.1999969482422,
|
| 1879 |
+
"y2": 465.73333740234375,
|
| 1880 |
+
"width": 816,
|
| 1881 |
+
"height": 1056,
|
| 1882 |
+
"pageNumber": 6
|
| 1883 |
+
},
|
| 1884 |
+
{
|
| 1885 |
+
"x1": 374.7833251953125,
|
| 1886 |
+
"y1": 448.73333740234375,
|
| 1887 |
+
"x2": 472.03334045410156,
|
| 1888 |
+
"y2": 465.73333740234375,
|
| 1889 |
+
"width": 816,
|
| 1890 |
+
"height": 1056,
|
| 1891 |
+
"pageNumber": 6
|
| 1892 |
+
},
|
| 1893 |
+
{
|
| 1894 |
+
"x1": 509.91668701171875,
|
| 1895 |
+
"y1": 448.73333740234375,
|
| 1896 |
+
"x2": 513.9500122070312,
|
| 1897 |
+
"y2": 465.73333740234375,
|
| 1898 |
+
"width": 816,
|
| 1899 |
+
"height": 1056,
|
| 1900 |
+
"pageNumber": 6
|
| 1901 |
+
},
|
| 1902 |
+
{
|
| 1903 |
+
"x1": 334.1500244140625,
|
| 1904 |
+
"y1": 455.23333740234375,
|
| 1905 |
+
"x2": 372.5500183105469,
|
| 1906 |
+
"y2": 466.23333740234375,
|
| 1907 |
+
"width": 816,
|
| 1908 |
+
"height": 1056,
|
| 1909 |
+
"pageNumber": 6
|
| 1910 |
+
},
|
| 1911 |
+
{
|
| 1912 |
+
"x1": 471.9666748046875,
|
| 1913 |
+
"y1": 469.79998779296875,
|
| 1914 |
+
"x2": 510.10003662109375,
|
| 1915 |
+
"y2": 480.79998779296875,
|
| 1916 |
+
"width": 816,
|
| 1917 |
+
"height": 1056,
|
| 1918 |
+
"pageNumber": 6
|
| 1919 |
+
},
|
| 1920 |
+
{
|
| 1921 |
+
"x1": 300.933349609375,
|
| 1922 |
+
"y1": 471.23333740234375,
|
| 1923 |
+
"x2": 321.1500244140625,
|
| 1924 |
+
"y2": 488.23333740234375,
|
| 1925 |
+
"width": 816,
|
| 1926 |
+
"height": 1056,
|
| 1927 |
+
"pageNumber": 6
|
| 1928 |
+
},
|
| 1929 |
+
{
|
| 1930 |
+
"x1": 375.1000061035156,
|
| 1931 |
+
"y1": 471.23333740234375,
|
| 1932 |
+
"x2": 472.03334045410156,
|
| 1933 |
+
"y2": 488.23333740234375,
|
| 1934 |
+
"width": 816,
|
| 1935 |
+
"height": 1056,
|
| 1936 |
+
"pageNumber": 6
|
| 1937 |
+
},
|
| 1938 |
+
{
|
| 1939 |
+
"x1": 509.91668701171875,
|
| 1940 |
+
"y1": 471.23333740234375,
|
| 1941 |
+
"x2": 513.9500122070312,
|
| 1942 |
+
"y2": 488.23333740234375,
|
| 1943 |
+
"width": 816,
|
| 1944 |
+
"height": 1056,
|
| 1945 |
+
"pageNumber": 6
|
| 1946 |
+
},
|
| 1947 |
+
{
|
| 1948 |
+
"x1": 321.0833435058594,
|
| 1949 |
+
"y1": 477.7166748046875,
|
| 1950 |
+
"x2": 372.88331604003906,
|
| 1951 |
+
"y2": 488.7166748046875,
|
| 1952 |
+
"width": 816,
|
| 1953 |
+
"height": 1056,
|
| 1954 |
+
"pageNumber": 6
|
| 1955 |
+
},
|
| 1956 |
+
{
|
| 1957 |
+
"x1": 143.53334045410156,
|
| 1958 |
+
"y1": 501.1166687011719,
|
| 1959 |
+
"x2": 672.8666687011719,
|
| 1960 |
+
"y2": 518.1166687011719,
|
| 1961 |
+
"width": 816,
|
| 1962 |
+
"height": 1056,
|
| 1963 |
+
"pageNumber": 6
|
| 1964 |
+
},
|
| 1965 |
+
{
|
| 1966 |
+
"x1": 144.0166778564453,
|
| 1967 |
+
"y1": 515.683349609375,
|
| 1968 |
+
"x2": 672.0500183105469,
|
| 1969 |
+
"y2": 532.683349609375,
|
| 1970 |
+
"width": 816,
|
| 1971 |
+
"height": 1056,
|
| 1972 |
+
"pageNumber": 6
|
| 1973 |
+
},
|
| 1974 |
+
{
|
| 1975 |
+
"x1": 144.0166778564453,
|
| 1976 |
+
"y1": 530.2666625976562,
|
| 1977 |
+
"x2": 673.7833404541016,
|
| 1978 |
+
"y2": 547.2666625976562,
|
| 1979 |
+
"width": 816,
|
| 1980 |
+
"height": 1056,
|
| 1981 |
+
"pageNumber": 6
|
| 1982 |
+
},
|
| 1983 |
+
{
|
| 1984 |
+
"x1": 144.0166778564453,
|
| 1985 |
+
"y1": 544.8333435058594,
|
| 1986 |
+
"x2": 415.4000244140625,
|
| 1987 |
+
"y2": 561.8333435058594,
|
| 1988 |
+
"width": 816,
|
| 1989 |
+
"height": 1056,
|
| 1990 |
+
"pageNumber": 6
|
| 1991 |
+
},
|
| 1992 |
+
{
|
| 1993 |
+
"x1": 449.20001220703125,
|
| 1994 |
+
"y1": 544.8333435058594,
|
| 1995 |
+
"x2": 672.433349609375,
|
| 1996 |
+
"y2": 561.8333435058594,
|
| 1997 |
+
"width": 816,
|
| 1998 |
+
"height": 1056,
|
| 1999 |
+
"pageNumber": 6
|
| 2000 |
+
},
|
| 2001 |
+
{
|
| 2002 |
+
"x1": 415.3333435058594,
|
| 2003 |
+
"y1": 550.9000244140625,
|
| 2004 |
+
"x2": 446.9666748046875,
|
| 2005 |
+
"y2": 561.9000244140625,
|
| 2006 |
+
"width": 816,
|
| 2007 |
+
"height": 1056,
|
| 2008 |
+
"pageNumber": 6
|
| 2009 |
+
},
|
| 2010 |
+
{
|
| 2011 |
+
"x1": 144.0166778564453,
|
| 2012 |
+
"y1": 559.2999877929688,
|
| 2013 |
+
"x2": 164.23333740234375,
|
| 2014 |
+
"y2": 576.2999877929688,
|
| 2015 |
+
"width": 816,
|
| 2016 |
+
"height": 1056,
|
| 2017 |
+
"pageNumber": 6
|
| 2018 |
+
},
|
| 2019 |
+
{
|
| 2020 |
+
"x1": 180.56666564941406,
|
| 2021 |
+
"y1": 559.2999877929688,
|
| 2022 |
+
"x2": 183.9166717529297,
|
| 2023 |
+
"y2": 576.2999877929688,
|
| 2024 |
+
"width": 816,
|
| 2025 |
+
"height": 1056,
|
| 2026 |
+
"pageNumber": 6
|
| 2027 |
+
},
|
| 2028 |
+
{
|
| 2029 |
+
"x1": 164.1666717529297,
|
| 2030 |
+
"y1": 565.4666748046875,
|
| 2031 |
+
"x2": 179.9666748046875,
|
| 2032 |
+
"y2": 576.4666748046875,
|
| 2033 |
+
"width": 816,
|
| 2034 |
+
"height": 1056,
|
| 2035 |
+
"pageNumber": 6
|
| 2036 |
+
},
|
| 2037 |
+
{
|
| 2038 |
+
"x1": 143.36666870117188,
|
| 2039 |
+
"y1": 581.1666870117188,
|
| 2040 |
+
"x2": 672.3666381835938,
|
| 2041 |
+
"y2": 598.1666870117188,
|
| 2042 |
+
"width": 816,
|
| 2043 |
+
"height": 1056,
|
| 2044 |
+
"pageNumber": 6
|
| 2045 |
+
},
|
| 2046 |
+
{
|
| 2047 |
+
"x1": 143.68333435058594,
|
| 2048 |
+
"y1": 595.7333374023438,
|
| 2049 |
+
"x2": 673.3666839599609,
|
| 2050 |
+
"y2": 612.7333374023438,
|
| 2051 |
+
"width": 816,
|
| 2052 |
+
"height": 1056,
|
| 2053 |
+
"pageNumber": 6
|
| 2054 |
+
},
|
| 2055 |
+
{
|
| 2056 |
+
"x1": 144.0166778564453,
|
| 2057 |
+
"y1": 610.2999877929688,
|
| 2058 |
+
"x2": 673.2666778564453,
|
| 2059 |
+
"y2": 627.2999877929688,
|
| 2060 |
+
"width": 816,
|
| 2061 |
+
"height": 1056,
|
| 2062 |
+
"pageNumber": 6
|
| 2063 |
+
},
|
| 2064 |
+
{
|
| 2065 |
+
"x1": 144.0166778564453,
|
| 2066 |
+
"y1": 624.8833312988281,
|
| 2067 |
+
"x2": 226.90000915527344,
|
| 2068 |
+
"y2": 641.8833312988281,
|
| 2069 |
+
"width": 816,
|
| 2070 |
+
"height": 1056,
|
| 2071 |
+
"pageNumber": 6
|
| 2072 |
+
}
|
| 2073 |
+
]
|
| 2074 |
+
},
|
| 2075 |
+
"content": {
|
| 2076 |
+
"text": "Since our model contains no recurrence and no convolution, in order for the model to make use of the\r order of the sequence, we must inject some information about the relative or absolute position of the\r tokens in the sequence. To this end, we add \"positional encodings\" to the input embeddings at the\r bottoms of the encoder and decoder stacks. The positional encodings have the same dimension dmodel\r as the embeddings, so that the two can be summed. There are many choices of positional encodings,\r learned and fixed [9].\r In this work, we use sine and cosine functions of different frequencies:\r P E(pos,2i) = sin(pos/100002i/dmodel )\r P E(pos,2i+1) = cos(pos/100002i/dmodel )\r where pos is the position and i is the dimension. That is, each dimension of the positional encoding\r corresponds to a sinusoid. The wavelengths form a geometric progression from 2π to 10000 · 2π. We\r chose this function because we hypothesized it would allow the model to easily learn to attend by\r relative positions, since for any fixed offset k, P Epos+k can be represented as a linear function of\r P Epos.\r We also experimented with using learned positional embeddings [9] instead, and found that the two\r versions produced nearly identical results (see Table 3 row (E)). We chose the sinusoidal version\r because it may allow the model to extrapolate to sequence lengths longer than the ones encountered\r during training."
|
| 2077 |
+
}
|
| 2078 |
+
}],
|
| 2079 |
+
"7": [{
|
| 2080 |
+
"id": "highlight_1755776991536",
|
| 2081 |
+
"position": {
|
| 2082 |
+
"boundingRect": {
|
| 2083 |
+
"x1": 143.53334045410156,
|
| 2084 |
+
"y1": 736.7000274658203,
|
| 2085 |
+
"x2": 673.7833404541016,
|
| 2086 |
+
"y2": 964.7000122070312,
|
| 2087 |
+
"width": 816,
|
| 2088 |
+
"height": 1056,
|
| 2089 |
+
"pageNumber": 6
|
| 2090 |
+
},
|
| 2091 |
+
"rects": [
|
| 2092 |
+
{
|
| 2093 |
+
"x1": 466.51666259765625,
|
| 2094 |
+
"y1": 736.7000274658203,
|
| 2095 |
+
"x2": 673.2833251953125,
|
| 2096 |
+
"y2": 753.7000274658203,
|
| 2097 |
+
"width": 816,
|
| 2098 |
+
"height": 1056,
|
| 2099 |
+
"pageNumber": 6
|
| 2100 |
+
},
|
| 2101 |
+
{
|
| 2102 |
+
"x1": 144.0166778564453,
|
| 2103 |
+
"y1": 751.2833557128906,
|
| 2104 |
+
"x2": 279.7166748046875,
|
| 2105 |
+
"y2": 768.2833557128906,
|
| 2106 |
+
"width": 816,
|
| 2107 |
+
"height": 1056,
|
| 2108 |
+
"pageNumber": 6
|
| 2109 |
+
},
|
| 2110 |
+
{
|
| 2111 |
+
"x1": 144.0166778564453,
|
| 2112 |
+
"y1": 773.1333465576172,
|
| 2113 |
+
"x2": 673.2500152587891,
|
| 2114 |
+
"y2": 790.1333465576172,
|
| 2115 |
+
"width": 816,
|
| 2116 |
+
"height": 1056,
|
| 2117 |
+
"pageNumber": 6
|
| 2118 |
+
},
|
| 2119 |
+
{
|
| 2120 |
+
"x1": 144.0166778564453,
|
| 2121 |
+
"y1": 787.7166748046875,
|
| 2122 |
+
"x2": 611.1333160400391,
|
| 2123 |
+
"y2": 804.7166748046875,
|
| 2124 |
+
"width": 816,
|
| 2125 |
+
"height": 1056,
|
| 2126 |
+
"pageNumber": 6
|
| 2127 |
+
},
|
| 2128 |
+
{
|
| 2129 |
+
"x1": 143.60000610351562,
|
| 2130 |
+
"y1": 809.4666748046875,
|
| 2131 |
+
"x2": 673.3999938964844,
|
| 2132 |
+
"y2": 826.4666748046875,
|
| 2133 |
+
"width": 816,
|
| 2134 |
+
"height": 1056,
|
| 2135 |
+
"pageNumber": 6
|
| 2136 |
+
},
|
| 2137 |
+
{
|
| 2138 |
+
"x1": 144.0166778564453,
|
| 2139 |
+
"y1": 824.0333557128906,
|
| 2140 |
+
"x2": 673.3000030517578,
|
| 2141 |
+
"y2": 841.0333557128906,
|
| 2142 |
+
"width": 816,
|
| 2143 |
+
"height": 1056,
|
| 2144 |
+
"pageNumber": 6
|
| 2145 |
+
},
|
| 2146 |
+
{
|
| 2147 |
+
"x1": 144.0166778564453,
|
| 2148 |
+
"y1": 838.61669921875,
|
| 2149 |
+
"x2": 673.3499908447266,
|
| 2150 |
+
"y2": 855.61669921875,
|
| 2151 |
+
"width": 816,
|
| 2152 |
+
"height": 1056,
|
| 2153 |
+
"pageNumber": 6
|
| 2154 |
+
},
|
| 2155 |
+
{
|
| 2156 |
+
"x1": 144.0166778564453,
|
| 2157 |
+
"y1": 853.183349609375,
|
| 2158 |
+
"x2": 673.3000030517578,
|
| 2159 |
+
"y2": 870.183349609375,
|
| 2160 |
+
"width": 816,
|
| 2161 |
+
"height": 1056,
|
| 2162 |
+
"pageNumber": 6
|
| 2163 |
+
},
|
| 2164 |
+
{
|
| 2165 |
+
"x1": 144.0166778564453,
|
| 2166 |
+
"y1": 867.6500244140625,
|
| 2167 |
+
"x2": 672.3333282470703,
|
| 2168 |
+
"y2": 884.6500244140625,
|
| 2169 |
+
"width": 816,
|
| 2170 |
+
"height": 1056,
|
| 2171 |
+
"pageNumber": 6
|
| 2172 |
+
},
|
| 2173 |
+
{
|
| 2174 |
+
"x1": 144.0166778564453,
|
| 2175 |
+
"y1": 882.2166748046875,
|
| 2176 |
+
"x2": 673.3333282470703,
|
| 2177 |
+
"y2": 899.2166748046875,
|
| 2178 |
+
"width": 816,
|
| 2179 |
+
"height": 1056,
|
| 2180 |
+
"pageNumber": 6
|
| 2181 |
+
},
|
| 2182 |
+
{
|
| 2183 |
+
"x1": 144.0166778564453,
|
| 2184 |
+
"y1": 896.8000183105469,
|
| 2185 |
+
"x2": 254.25,
|
| 2186 |
+
"y2": 913.8000183105469,
|
| 2187 |
+
"width": 816,
|
| 2188 |
+
"height": 1056,
|
| 2189 |
+
"pageNumber": 6
|
| 2190 |
+
},
|
| 2191 |
+
{
|
| 2192 |
+
"x1": 143.53334045410156,
|
| 2193 |
+
"y1": 918.6500244140625,
|
| 2194 |
+
"x2": 673.7833404541016,
|
| 2195 |
+
"y2": 935.6500244140625,
|
| 2196 |
+
"width": 816,
|
| 2197 |
+
"height": 1056,
|
| 2198 |
+
"pageNumber": 6
|
| 2199 |
+
},
|
| 2200 |
+
{
|
| 2201 |
+
"x1": 144.0166778564453,
|
| 2202 |
+
"y1": 933.2333374023438,
|
| 2203 |
+
"x2": 672.4333190917969,
|
| 2204 |
+
"y2": 950.2333374023438,
|
| 2205 |
+
"width": 816,
|
| 2206 |
+
"height": 1056,
|
| 2207 |
+
"pageNumber": 6
|
| 2208 |
+
},
|
| 2209 |
+
{
|
| 2210 |
+
"x1": 144.0166778564453,
|
| 2211 |
+
"y1": 947.7000122070312,
|
| 2212 |
+
"x2": 673.3000030517578,
|
| 2213 |
+
"y2": 964.7000122070312,
|
| 2214 |
+
"width": 816,
|
| 2215 |
+
"height": 1056,
|
| 2216 |
+
"pageNumber": 6
|
| 2217 |
+
}
|
| 2218 |
+
]
|
| 2219 |
+
},
|
| 2220 |
+
"content": {
|
| 2221 |
+
"text": "Motivating our use of self-attention we\r consider three desiderata.\r One is the total computational complexity per layer. Another is the amount of computation that can\r be parallelized, as measured by the minimum number of sequential operations required.\r The third is the path length between long-range dependencies in the network. Learning long-range\r dependencies is a key challenge in many sequence transduction tasks. One key factor affecting the\r ability to learn such dependencies is the length of the paths forward and backward signals have to\r traverse in the network. The shorter these paths between any combination of positions in the input\r and output sequences, the easier it is to learn long-range dependencies [12]. Hence we also compare\r the maximum path length between any two input and output positions in networks composed of the\r different layer types.\r As noted in Table 1, a self-attention layer connects all positions with a constant number of sequentially\r executed operations, whereas a recurrent layer requires O(n) sequential operations. In terms of\r computational complexity, self-attention layers are faster than recurrent layers when the sequence"
|
| 2222 |
+
}
|
| 2223 |
+
}, {
|
| 2224 |
+
"id": "highlight_1755777048311",
|
| 2225 |
+
"position": {
|
| 2226 |
+
"boundingRect": {
|
| 2227 |
+
"x1": 144.0166778564453,
|
| 2228 |
+
"y1": 96.98333740234375,
|
| 2229 |
+
"x2": 672.5,
|
| 2230 |
+
"y2": 128.55001831054688,
|
| 2231 |
+
"width": 816,
|
| 2232 |
+
"height": 1056,
|
| 2233 |
+
"pageNumber": 7
|
| 2234 |
+
},
|
| 2235 |
+
"rects": [
|
| 2236 |
+
{
|
| 2237 |
+
"x1": 144.0166778564453,
|
| 2238 |
+
"y1": 96.98333740234375,
|
| 2239 |
+
"x2": 672.5,
|
| 2240 |
+
"y2": 113.98333740234375,
|
| 2241 |
+
"width": 816,
|
| 2242 |
+
"height": 1056,
|
| 2243 |
+
"pageNumber": 7
|
| 2244 |
+
},
|
| 2245 |
+
{
|
| 2246 |
+
"x1": 144.0166778564453,
|
| 2247 |
+
"y1": 111.55001831054688,
|
| 2248 |
+
"x2": 271.1000061035156,
|
| 2249 |
+
"y2": 128.55001831054688,
|
| 2250 |
+
"width": 816,
|
| 2251 |
+
"height": 1056,
|
| 2252 |
+
"pageNumber": 7
|
| 2253 |
+
}
|
| 2254 |
+
]
|
| 2255 |
+
},
|
| 2256 |
+
"content": {
|
| 2257 |
+
"text": "length n is smaller than the representation dimensionality d, which is most often the case with\r sentence representation"
|
| 2258 |
+
}
|
| 2259 |
+
}]
|
| 2260 |
+
}
|
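The entries in `highlights.json` all share one shape: highlights grouped under a section key, each carrying an `id`, a `position` (a `boundingRect` plus per-line `rects`, all expressed in the coordinate space given by their `width`/`height` capture dimensions), and the extracted `content.text`. A minimal TypeScript sketch of that schema, with a hypothetical `scaleRect` helper (the types and the helper are inferred for illustration, not part of the committed code) showing why each rect stores its capture dimensions:

```typescript
// Shape of one rectangle in highlights.json. Coordinates are only valid at
// the page size they were captured at (816x1056 in this file), so width and
// height travel with every rect.
interface Rect {
  x1: number;
  y1: number;
  x2: number;
  y2: number;
  width: number;      // page width at capture time
  height: number;     // page height at capture time
  pageNumber: number;
}

interface Highlight {
  id: string;
  position: { boundingRect: Rect; rects: Rect[] };
  content: { text: string };
}

// Highlights are grouped under string section keys ("5", "6", "7", ...).
type HighlightsFile = Record<string, Highlight[]>;

// Rescale a stored rect to the page size the viewer is currently rendering,
// so highlights stay aligned when the PDF is zoomed or resized.
function scaleRect(rect: Rect, renderedWidth: number, renderedHeight: number): Rect {
  const sx = renderedWidth / rect.width;
  const sy = renderedHeight / rect.height;
  return {
    ...rect,
    x1: rect.x1 * sx,
    y1: rect.y1 * sy,
    x2: rect.x2 * sx,
    y2: rect.y2 * sy,
    width: renderedWidth,
    height: renderedHeight,
  };
}
```

With this, a viewer rendering page 5 at twice the capture size would map a stored `x1` of 408 to 816 before drawing the overlay.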