rawanessam committed · Commit 32fa779 · verified · 1 Parent(s): 5412499

Upload 25 files

LICENSE ADDED
                    GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU General Public License is a free, copyleft license for
software and other kinds of works.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.

  Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

  For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

  Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

  Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7. This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy. This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged. This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source. This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge. You need not require recipients to copy the
    Corresponding Source along with the object code. If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source. Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

  13. Use with the GNU Affero General Public License.

  Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

  14. Revised Versions of this License.

  The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

  Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
573
+ option of following the terms and conditions either of that numbered
574
+ version or of any later version published by the Free Software
575
+ Foundation. If the Program does not specify a version number of the
576
+ GNU General Public License, you may choose any version ever published
577
+ by the Free Software Foundation.
578
+
579
+ If the Program specifies that a proxy can decide which future
580
+ versions of the GNU General Public License can be used, that proxy's
581
+ public statement of acceptance of a version permanently authorizes you
582
+ to choose that version for the Program.
583
+
584
+ Later license versions may give you additional or different
585
+ permissions. However, no additional obligations are imposed on any
586
+ author or copyright holder as a result of your choosing to follow a
587
+ later version.
588
+
589
+ 15. Disclaimer of Warranty.
590
+
591
+ THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592
+ APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593
+ HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594
+ OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595
+ THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596
+ PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597
+ IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598
+ ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599
+
600
+ 16. Limitation of Liability.
601
+
602
+ IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603
+ WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604
+ THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605
+ GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606
+ USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607
+ DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608
+ PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609
+ EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610
+ SUCH DAMAGES.
611
+
612
+ 17. Interpretation of Sections 15 and 16.
613
+
614
+ If the disclaimer of warranty and limitation of liability provided
615
+ above cannot be given local legal effect according to their terms,
616
+ reviewing courts shall apply local law that most closely approximates
617
+ an absolute waiver of all civil liability in connection with the
618
+ Program, unless a warranty or assumption of liability accompanies a
619
+ copy of the Program in return for a fee.
620
+
621
+ END OF TERMS AND CONDITIONS
622
+
623
+ How to Apply These Terms to Your New Programs
624
+
625
+ If you develop a new program, and you want it to be of the greatest
626
+ possible use to the public, the best way to achieve this is to make it
627
+ free software which everyone can redistribute and change under these terms.
628
+
629
+ To do so, attach the following notices to the program. It is safest
630
+ to attach them to the start of each source file to most effectively
631
+ state the exclusion of warranty; and each file should have at least
632
+ the "copyright" line and a pointer to where the full notice is found.
633
+
634
+ <one line to give the program's name and a brief idea of what it does.>
635
+ Copyright (C) <year> <name of author>
636
+
637
+ This program is free software: you can redistribute it and/or modify
638
+ it under the terms of the GNU General Public License as published by
639
+ the Free Software Foundation, either version 3 of the License, or
640
+ (at your option) any later version.
641
+
642
+ This program is distributed in the hope that it will be useful,
643
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
644
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645
+ GNU General Public License for more details.
646
+
647
+ You should have received a copy of the GNU General Public License
648
+ along with this program. If not, see <https://www.gnu.org/licenses/>.
649
+
650
+ Also add information on how to contact you by electronic and paper mail.
651
+
652
+ If the program does terminal interaction, make it output a short
653
+ notice like this when it starts in an interactive mode:
654
+
655
+ <program> Copyright (C) <year> <name of author>
656
+ This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657
+ This is free software, and you are welcome to redistribute it
658
+ under certain conditions; type `show c' for details.
659
+
660
+ The hypothetical commands `show w' and `show c' should show the appropriate
661
+ parts of the General Public License. Of course, your program's commands
662
+ might be different; for a GUI interface, you would use an "about box".
663
+
664
+ You should also get your employer (if you work as a programmer) or school,
665
+ if any, to sign a "copyright disclaimer" for the program, if necessary.
666
+ For more information on this, and how to apply and follow the GNU GPL, see
667
+ <https://www.gnu.org/licenses/>.
668
+
669
+ The GNU General Public License does not permit incorporating your program
670
+ into proprietary programs. If your program is a subroutine library, you
671
+ may consider it more useful to permit linking proprietary applications with
672
+ the library. If this is what you want to do, use the GNU Lesser General
673
+ Public License instead of this License. But first, please read
674
+ <https://www.gnu.org/licenses/why-not-lgpl.html>.
README.md CHANGED
@@ -1,12 +1,153 @@
  ---
- title: DeepFloorPlan
- emoji: 📈
- colorFrom: purple
- colorTo: green
- sdk: gradio
- sdk_version: 5.36.2
- app_file: app.py
- pinned: false
  ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+ # Deep Floor Plan Recognition using a Multi-task Network with Room-boundary-Guided Attention
+ By Zhiliang ZENG, Xianzhi LI, Ying Kin Yu, and Chi-Wing Fu
+
+ [2021/07/26: updated download link]
+
+ [2019/08/28: updated train/test/score code & dataset]
+
+ [2019/07/29: updated demo code & pretrained model]
+
+ ## Introduction
+
+ This repository contains the code & annotation data for our ICCV 2019 paper: ['Deep Floor Plan Recognition Using a Multi-Task Network with Room-Boundary-Guided Attention'](https://arxiv.org/abs/1908.11025). In this paper, we present a new method for recognizing floor plan elements by exploring the spatial relationships between floor plan elements, modeling a hierarchy of floor plan elements, and designing a multi-task network that learns to recognize room-boundary and room-type elements in floor plans.
+
+ ## Requirements
+
+ - Python 3.7+
+ - See `requirements.txt` for all dependencies (TensorFlow 1.x, Pillow, imageio, gradio, etc.)
+
+ Our code has been tested with tensorflow-gpu==1.10.1 & OpenCV==3.1.0, on an Nvidia Titan Xp GPU with CUDA 9.0 installed.
+
+ ## Python packages
+
+ - [numpy]
+ - [scipy]
+ - [Pillow]
+ - [matplotlib]
+
+ ## Data
+
+ We share all our annotations and the train-test split file [here](https://mycuhk-my.sharepoint.com/:f:/g/personal/1155052510_link_cuhk_edu_hk/EseSIeHQgPxArPlNpGdVp38BIjUg70jMiAO-w4f3s8B_dg?e=UXKbYO). Alternatively, download the annotations using the link in the file "dataset/download_links.txt". The additional round plan is included in the annotations.
+
+ Our annotations are saved in png format. Files with the suffixes "\_wall.png", "\_close.png" and "\_room.png" denote the "wall", "door & window" and "room types" labels, respectively. We used these labels to train our multi-task network.
+
+ Files with the suffix "\_close_wall.png" combine the "wall" and "door & window" labels. We don't use this label in our paper, but it may be useful for other tasks.
+
+ Files with the suffix "\_multi.png" combine all the labels. We used this kind of label to retrain the general segmentation network.
+
+ We also provide our training data for the R3D dataset in "tfrecord" format, which improves loading speed during training.
+
+ To create the "tfrecord" training set, please refer to the example code in "utils/create_tfrecord.py".
+
+ For the raw floor plan images, please refer to the following two links:
+
+ - R2V: <https://github.com/art-programmer/FloorplanTransformation.git>
+ - R3D: <http://www.cs.toronto.edu/~fidler/projects/rent3D.html>
+
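Each line of the train/test split files (e.g. `dataset/r3d_test.txt`) pairs one input image with its four label maps, separated by whitespace. A minimal sketch of parsing one such line — the `Sample` helper is our own illustration, not part of the repository:

```python
from typing import NamedTuple

class Sample(NamedTuple):
    image: str       # input floor plan image
    wall: str        # "*_wall.png"
    close: str       # "*_close.png" (doors & windows)
    rooms: str       # "*_rooms.png"
    close_wall: str  # "*_close_wall.png" (walls + doors & windows)

def parse_split_line(line: str) -> Sample:
    """Parse one whitespace-separated line of a train/test split file."""
    parts = line.split()
    if len(parts) != 5:
        raise ValueError(f"expected 5 paths per line, got {len(parts)}")
    return Sample(*parts)

line = ("../dataset/newyork/test/21.jpg ../dataset/newyork/test/21_wall.png "
        "../dataset/newyork/test/21_close.png ../dataset/newyork/test/21_rooms.png "
        "../dataset/newyork/test/21_close_wall.png")
sample = parse_split_line(line)
print(sample.rooms)  # ../dataset/newyork/test/21_rooms.png
```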
+ ## Usage
+
+ To run our demo code, first download the pretrained model (see the link in the "pretrained/download_links.txt" file), unzip it into the "pretrained" folder, then run
+
+ ```bash
+ python demo.py --im_path=./demo/45719584.jpg
+ ```
+
+ To train the network, simply run
+
+ ```bash
+ python main.py --phase=Train
+ ```
+
+ Run the following command to generate the network outputs; all results are saved in png format.
+
+ ```bash
+ python main.py --phase=Test
+ ```
+
+ To compute the evaluation metrics, first run inference to generate the results, then run
+
+ ```bash
+ python scores.py --dataset=R3D
+ ```
+
+ To use our post-processing method, first run inference to generate the results, then run
+
+ ```bash
+ python postprocess.py
+ ```
+
+ or
+
+ ```bash
+ python postprocess.py --result_dir=./[result_folder_path]
+ ```
+
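For intuition, the core idea behind this kind of post-processing — flood-fill the predicted room-type map into regions separated by room-boundary pixels, then give each region its majority room label — can be sketched as below. This is a toy illustration on made-up data, not the code in `postprocess.py`:

```python
import numpy as np
from collections import deque, Counter

def majority_vote_rooms(room: np.ndarray, boundary: np.ndarray) -> np.ndarray:
    """room: HxW int class map; boundary: HxW bool (walls/openings).
    Returns a cleaned map where each enclosed region has one label."""
    h, w = room.shape
    seen = np.zeros((h, w), dtype=bool) | boundary  # never enter boundary pixels
    out = room.copy()
    for sy in range(h):
        for sx in range(w):
            if seen[sy, sx]:
                continue
            # BFS over one region enclosed by boundary pixels
            queue, region = deque([(sy, sx)]), []
            seen[sy, sx] = True
            while queue:
                y, x = queue.popleft()
                region.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny, nx]:
                        seen[ny, nx] = True
                        queue.append((ny, nx))
            # assign the region's majority label everywhere in the region
            label = Counter(room[y, x] for y, x in region).most_common(1)[0][0]
            for y, x in region:
                out[y, x] = label
    return out

room = np.array([[1, 1, 2],
                 [1, 1, 1],
                 [3, 3, 3]])
boundary = np.array([[0, 0, 0],
                     [0, 0, 0],
                     [1, 1, 1]], dtype=bool)
print(majority_vote_rooms(room, boundary))  # stray "2" becomes "1"
```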
+ ## Citation
+
+ If you find our work useful in your research, please consider citing:
+
+ ---
+
+ @InProceedings{zlzeng2019deepfloor,
+   author = {Zhiliang Zeng and Xianzhi Li and Ying Kin Yu and Chi-Wing Fu},
+   title = {Deep Floor Plan Recognition using a Multi-task Network with Room-boundary-Guided Attention},
+   booktitle = {IEEE International Conference on Computer Vision (ICCV)},
+   year = {2019}
+ }
+
+ ---
+
+ ---
+
+ ## Quick Start with Gradio (Python 3)
+
+ 1. Install dependencies:
+
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+ 2. Download the pretrained model (see `pretrained/download_links.txt`) and place it in the `pretrained` folder.
+
+ 3. Run the Gradio app:
+
+ ```bash
+ python app.py
+ ```
+
+ This will launch a web interface where you can upload a floorplan image and view the predicted segmentation.
+
  ---
+
+ ## Deploy on Hugging Face Spaces
+
+ - Upload the repository (with `app.py`, `deepfloorplan_inference.py`, and `requirements.txt`) to your Hugging Face Space.
+ - Make sure the `pretrained` model weights are included in, or downloaded by, the Space.
+ - The Gradio app will be served automatically.
+

  ---

+ ## API Usage with Hugging Face Inference Endpoints
+
+ - Deploy the repository to a Hugging Face Inference Endpoint.
+ - The endpoint will use `api_inference.py` and expose a `predict` function.
+ - Example usage (Python):
+
+ ```python
+ import requests
+ from PIL import Image
+ import io
+
+ # Replace with your endpoint URL and token
+ API_URL = 'https://api-inference.huggingface.co/models/your-username/your-model'
+ headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}
+
+ image = Image.open('your_image.jpg')
+ buffer = io.BytesIO()
+ image.save(buffer, format='PNG')
+ response = requests.post(API_URL, headers=headers, files={"image": buffer.getvalue()})
+ response.raise_for_status()
+ result = Image.open(io.BytesIO(response.content))
+ result.show()
+ ```
+
+ ---
api_inference.py ADDED
@@ -0,0 +1,20 @@
+ from PIL import Image
+ import numpy as np
+ from deepfloorplan_inference import DeepFloorPlanModel
+
+ class EndpointModel:
+     def __init__(self):
+         self.model = DeepFloorPlanModel(model_dir='pretrained')
+
+     def __call__(self, image):
+         # image: PIL Image or numpy array
+         if isinstance(image, np.ndarray):
+             image = Image.fromarray(image)
+         result = self.model.predict(image)
+         return Image.fromarray(result.astype(np.uint8))
+
+ # For Hugging Face Inference Endpoints
+ model = EndpointModel()
+
+ def predict(image):
+     return model(image)
app.py ADDED
@@ -0,0 +1,25 @@
+ import gradio as gr
+ import numpy as np
+ from PIL import Image
+ from deepfloorplan_inference import DeepFloorPlanModel
+
+ # Load model once at startup
+ model = DeepFloorPlanModel(model_dir='pretrained')
+
+ def predict_floorplan(image):
+     # image: PIL Image from Gradio
+     result = model.predict(image)
+     # Convert numpy array to PIL Image for Gradio output
+     return Image.fromarray(result.astype(np.uint8))
+
+ iface = gr.Interface(
+     fn=predict_floorplan,
+     inputs=gr.Image(type="pil", label="Upload Floorplan Image"),
+     outputs=gr.Image(type="pil", label="Predicted Segmentation"),
+     title="Deep Floor Plan Segmentation",
+     description="Upload a floorplan image to get the predicted segmentation using the Deep Floor Plan model.",
+     flagging_mode="never"
+ )
+
+ if __name__ == "__main__":
+     iface.launch()
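The segmentation that `model.predict` returns is a per-pixel array of class indices; a common way to turn such an index map into a displayable color image is palette indexing with NumPy. A sketch with a made-up three-color palette — purely illustrative, the repository defines its own floor-plan colors:

```python
import numpy as np

# Hypothetical palette: class index -> RGB color
PALETTE = np.array([[255, 255, 255],   # 0: background
                    [0, 0, 0],         # 1: wall
                    [224, 128, 64]],   # 2: room
                   dtype=np.uint8)

def colorize(label_map: np.ndarray) -> np.ndarray:
    """Turn an HxW map of class indices into an HxWx3 uint8 RGB image
    via fancy indexing: each pixel's index selects a palette row."""
    return PALETTE[label_map]

labels = np.array([[0, 1],
                   [2, 1]])
rgb = colorize(labels)
print(rgb.shape)  # (2, 2, 3)
```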
dataset/download_links.txt ADDED
@@ -0,0 +1 @@
+ https://mycuhk-my.sharepoint.com/:f:/g/personal/1155052510_link_cuhk_edu_hk/EseSIeHQgPxArPlNpGdVp38BIjUg70jMiAO-w4f3s8B_dg?e=UXKbYO
dataset/newyork ADDED
@@ -0,0 +1 @@
+ /home/zlzeng/floorplan_v2/dataset/newyork
dataset/r2v_test.txt ADDED
@@ -0,0 +1,100 @@
+ ../dataset/jp/test/100_input.jpg ../dataset/jp/test/100_wall.png ../dataset/jp/test/100_close.png ../dataset/jp/test/100_rooms.png ../dataset/jp/test/100_close_wall.png
+ ../dataset/jp/test/10_input.jpg ../dataset/jp/test/10_wall.png ../dataset/jp/test/10_close.png ../dataset/jp/test/10_rooms.png ../dataset/jp/test/10_close_wall.png
+ ../dataset/jp/test/11_input.jpg ../dataset/jp/test/11_wall.png ../dataset/jp/test/11_close.png ../dataset/jp/test/11_rooms.png ../dataset/jp/test/11_close_wall.png
+ ../dataset/jp/test/12_input.jpg ../dataset/jp/test/12_wall.png ../dataset/jp/test/12_close.png ../dataset/jp/test/12_rooms.png ../dataset/jp/test/12_close_wall.png
+ ../dataset/jp/test/13_input.jpg ../dataset/jp/test/13_wall.png ../dataset/jp/test/13_close.png ../dataset/jp/test/13_rooms.png ../dataset/jp/test/13_close_wall.png
+ ../dataset/jp/test/14_input.jpg ../dataset/jp/test/14_wall.png ../dataset/jp/test/14_close.png ../dataset/jp/test/14_rooms.png ../dataset/jp/test/14_close_wall.png
+ ../dataset/jp/test/15_input.jpg ../dataset/jp/test/15_wall.png ../dataset/jp/test/15_close.png ../dataset/jp/test/15_rooms.png ../dataset/jp/test/15_close_wall.png
+ ../dataset/jp/test/16_input.jpg ../dataset/jp/test/16_wall.png ../dataset/jp/test/16_close.png ../dataset/jp/test/16_rooms.png ../dataset/jp/test/16_close_wall.png
+ ../dataset/jp/test/17_input.jpg ../dataset/jp/test/17_wall.png ../dataset/jp/test/17_close.png ../dataset/jp/test/17_rooms.png ../dataset/jp/test/17_close_wall.png
+ ../dataset/jp/test/18_input.jpg ../dataset/jp/test/18_wall.png ../dataset/jp/test/18_close.png ../dataset/jp/test/18_rooms.png ../dataset/jp/test/18_close_wall.png
+ ../dataset/jp/test/19_input.jpg ../dataset/jp/test/19_wall.png ../dataset/jp/test/19_close.png ../dataset/jp/test/19_rooms.png ../dataset/jp/test/19_close_wall.png
+ ../dataset/jp/test/1_input.jpg ../dataset/jp/test/1_wall.png ../dataset/jp/test/1_close.png ../dataset/jp/test/1_rooms.png ../dataset/jp/test/1_close_wall.png
+ ../dataset/jp/test/20_input.jpg ../dataset/jp/test/20_wall.png ../dataset/jp/test/20_close.png ../dataset/jp/test/20_rooms.png ../dataset/jp/test/20_close_wall.png
+ ../dataset/jp/test/21_input.jpg ../dataset/jp/test/21_wall.png ../dataset/jp/test/21_close.png ../dataset/jp/test/21_rooms.png ../dataset/jp/test/21_close_wall.png
+ ../dataset/jp/test/22_input.jpg ../dataset/jp/test/22_wall.png ../dataset/jp/test/22_close.png ../dataset/jp/test/22_rooms.png ../dataset/jp/test/22_close_wall.png
+ ../dataset/jp/test/23_input.jpg ../dataset/jp/test/23_wall.png ../dataset/jp/test/23_close.png ../dataset/jp/test/23_rooms.png ../dataset/jp/test/23_close_wall.png
+ ../dataset/jp/test/24_input.jpg ../dataset/jp/test/24_wall.png ../dataset/jp/test/24_close.png ../dataset/jp/test/24_rooms.png ../dataset/jp/test/24_close_wall.png
+ ../dataset/jp/test/25_input.jpg ../dataset/jp/test/25_wall.png ../dataset/jp/test/25_close.png ../dataset/jp/test/25_rooms.png ../dataset/jp/test/25_close_wall.png
+ ../dataset/jp/test/26_input.jpg ../dataset/jp/test/26_wall.png ../dataset/jp/test/26_close.png ../dataset/jp/test/26_rooms.png ../dataset/jp/test/26_close_wall.png
+ ../dataset/jp/test/27_input.jpg ../dataset/jp/test/27_wall.png ../dataset/jp/test/27_close.png ../dataset/jp/test/27_rooms.png ../dataset/jp/test/27_close_wall.png
+ ../dataset/jp/test/28_input.jpg ../dataset/jp/test/28_wall.png ../dataset/jp/test/28_close.png ../dataset/jp/test/28_rooms.png ../dataset/jp/test/28_close_wall.png
+ ../dataset/jp/test/29_input.jpg ../dataset/jp/test/29_wall.png ../dataset/jp/test/29_close.png ../dataset/jp/test/29_rooms.png ../dataset/jp/test/29_close_wall.png
+ ../dataset/jp/test/2_input.jpg ../dataset/jp/test/2_wall.png ../dataset/jp/test/2_close.png ../dataset/jp/test/2_rooms.png ../dataset/jp/test/2_close_wall.png
+ ../dataset/jp/test/30_input.jpg ../dataset/jp/test/30_wall.png ../dataset/jp/test/30_close.png ../dataset/jp/test/30_rooms.png ../dataset/jp/test/30_close_wall.png
+ ../dataset/jp/test/31_input.jpg ../dataset/jp/test/31_wall.png ../dataset/jp/test/31_close.png ../dataset/jp/test/31_rooms.png ../dataset/jp/test/31_close_wall.png
+ ../dataset/jp/test/32_input.jpg ../dataset/jp/test/32_wall.png ../dataset/jp/test/32_close.png ../dataset/jp/test/32_rooms.png ../dataset/jp/test/32_close_wall.png
+ ../dataset/jp/test/33_input.jpg ../dataset/jp/test/33_wall.png ../dataset/jp/test/33_close.png ../dataset/jp/test/33_rooms.png ../dataset/jp/test/33_close_wall.png
+ ../dataset/jp/test/34_input.jpg ../dataset/jp/test/34_wall.png ../dataset/jp/test/34_close.png ../dataset/jp/test/34_rooms.png ../dataset/jp/test/34_close_wall.png
+ ../dataset/jp/test/35_input.jpg ../dataset/jp/test/35_wall.png ../dataset/jp/test/35_close.png ../dataset/jp/test/35_rooms.png ../dataset/jp/test/35_close_wall.png
+ ../dataset/jp/test/36_input.jpg ../dataset/jp/test/36_wall.png ../dataset/jp/test/36_close.png ../dataset/jp/test/36_rooms.png ../dataset/jp/test/36_close_wall.png
+ ../dataset/jp/test/37_input.jpg ../dataset/jp/test/37_wall.png ../dataset/jp/test/37_close.png ../dataset/jp/test/37_rooms.png ../dataset/jp/test/37_close_wall.png
+ ../dataset/jp/test/38_input.jpg ../dataset/jp/test/38_wall.png ../dataset/jp/test/38_close.png ../dataset/jp/test/38_rooms.png ../dataset/jp/test/38_close_wall.png
+ ../dataset/jp/test/39_input.jpg ../dataset/jp/test/39_wall.png ../dataset/jp/test/39_close.png ../dataset/jp/test/39_rooms.png ../dataset/jp/test/39_close_wall.png
+ ../dataset/jp/test/3_input.jpg ../dataset/jp/test/3_wall.png ../dataset/jp/test/3_close.png ../dataset/jp/test/3_rooms.png ../dataset/jp/test/3_close_wall.png
+ ../dataset/jp/test/40_input.jpg ../dataset/jp/test/40_wall.png ../dataset/jp/test/40_close.png ../dataset/jp/test/40_rooms.png ../dataset/jp/test/40_close_wall.png
+ ../dataset/jp/test/41_input.jpg ../dataset/jp/test/41_wall.png ../dataset/jp/test/41_close.png ../dataset/jp/test/41_rooms.png ../dataset/jp/test/41_close_wall.png
+ ../dataset/jp/test/42_input.jpg ../dataset/jp/test/42_wall.png ../dataset/jp/test/42_close.png ../dataset/jp/test/42_rooms.png ../dataset/jp/test/42_close_wall.png
+ ../dataset/jp/test/43_input.jpg ../dataset/jp/test/43_wall.png ../dataset/jp/test/43_close.png ../dataset/jp/test/43_rooms.png ../dataset/jp/test/43_close_wall.png
+ ../dataset/jp/test/44_input.jpg ../dataset/jp/test/44_wall.png ../dataset/jp/test/44_close.png ../dataset/jp/test/44_rooms.png ../dataset/jp/test/44_close_wall.png
+ ../dataset/jp/test/45_input.jpg ../dataset/jp/test/45_wall.png ../dataset/jp/test/45_close.png ../dataset/jp/test/45_rooms.png ../dataset/jp/test/45_close_wall.png
+ ../dataset/jp/test/46_input.jpg ../dataset/jp/test/46_wall.png ../dataset/jp/test/46_close.png ../dataset/jp/test/46_rooms.png ../dataset/jp/test/46_close_wall.png
+ ../dataset/jp/test/47_input.jpg ../dataset/jp/test/47_wall.png ../dataset/jp/test/47_close.png ../dataset/jp/test/47_rooms.png ../dataset/jp/test/47_close_wall.png
+ ../dataset/jp/test/48_input.jpg ../dataset/jp/test/48_wall.png ../dataset/jp/test/48_close.png ../dataset/jp/test/48_rooms.png ../dataset/jp/test/48_close_wall.png
+ ../dataset/jp/test/49_input.jpg ../dataset/jp/test/49_wall.png ../dataset/jp/test/49_close.png ../dataset/jp/test/49_rooms.png ../dataset/jp/test/49_close_wall.png
+ ../dataset/jp/test/4_input.jpg ../dataset/jp/test/4_wall.png ../dataset/jp/test/4_close.png ../dataset/jp/test/4_rooms.png ../dataset/jp/test/4_close_wall.png
+ ../dataset/jp/test/50_input.jpg ../dataset/jp/test/50_wall.png ../dataset/jp/test/50_close.png ../dataset/jp/test/50_rooms.png ../dataset/jp/test/50_close_wall.png
+ ../dataset/jp/test/51_input.jpg ../dataset/jp/test/51_wall.png ../dataset/jp/test/51_close.png ../dataset/jp/test/51_rooms.png ../dataset/jp/test/51_close_wall.png
+ ../dataset/jp/test/52_input.jpg ../dataset/jp/test/52_wall.png ../dataset/jp/test/52_close.png ../dataset/jp/test/52_rooms.png ../dataset/jp/test/52_close_wall.png
+ ../dataset/jp/test/53_input.jpg ../dataset/jp/test/53_wall.png ../dataset/jp/test/53_close.png ../dataset/jp/test/53_rooms.png ../dataset/jp/test/53_close_wall.png
+ ../dataset/jp/test/54_input.jpg ../dataset/jp/test/54_wall.png ../dataset/jp/test/54_close.png ../dataset/jp/test/54_rooms.png ../dataset/jp/test/54_close_wall.png
+ ../dataset/jp/test/55_input.jpg ../dataset/jp/test/55_wall.png ../dataset/jp/test/55_close.png ../dataset/jp/test/55_rooms.png ../dataset/jp/test/55_close_wall.png
+ ../dataset/jp/test/56_input.jpg ../dataset/jp/test/56_wall.png ../dataset/jp/test/56_close.png ../dataset/jp/test/56_rooms.png ../dataset/jp/test/56_close_wall.png
+ ../dataset/jp/test/57_input.jpg ../dataset/jp/test/57_wall.png ../dataset/jp/test/57_close.png ../dataset/jp/test/57_rooms.png ../dataset/jp/test/57_close_wall.png
+ ../dataset/jp/test/58_input.jpg ../dataset/jp/test/58_wall.png ../dataset/jp/test/58_close.png ../dataset/jp/test/58_rooms.png ../dataset/jp/test/58_close_wall.png
+ ../dataset/jp/test/59_input.jpg ../dataset/jp/test/59_wall.png ../dataset/jp/test/59_close.png ../dataset/jp/test/59_rooms.png ../dataset/jp/test/59_close_wall.png
+ ../dataset/jp/test/5_input.jpg ../dataset/jp/test/5_wall.png ../dataset/jp/test/5_close.png ../dataset/jp/test/5_rooms.png ../dataset/jp/test/5_close_wall.png
+ ../dataset/jp/test/60_input.jpg ../dataset/jp/test/60_wall.png ../dataset/jp/test/60_close.png ../dataset/jp/test/60_rooms.png ../dataset/jp/test/60_close_wall.png
+ ../dataset/jp/test/61_input.jpg ../dataset/jp/test/61_wall.png ../dataset/jp/test/61_close.png ../dataset/jp/test/61_rooms.png ../dataset/jp/test/61_close_wall.png
+ ../dataset/jp/test/62_input.jpg ../dataset/jp/test/62_wall.png ../dataset/jp/test/62_close.png ../dataset/jp/test/62_rooms.png ../dataset/jp/test/62_close_wall.png
+ ../dataset/jp/test/63_input.jpg ../dataset/jp/test/63_wall.png ../dataset/jp/test/63_close.png ../dataset/jp/test/63_rooms.png ../dataset/jp/test/63_close_wall.png
+ ../dataset/jp/test/64_input.jpg ../dataset/jp/test/64_wall.png ../dataset/jp/test/64_close.png ../dataset/jp/test/64_rooms.png ../dataset/jp/test/64_close_wall.png
+ ../dataset/jp/test/65_input.jpg ../dataset/jp/test/65_wall.png ../dataset/jp/test/65_close.png ../dataset/jp/test/65_rooms.png ../dataset/jp/test/65_close_wall.png
+ ../dataset/jp/test/66_input.jpg ../dataset/jp/test/66_wall.png ../dataset/jp/test/66_close.png ../dataset/jp/test/66_rooms.png ../dataset/jp/test/66_close_wall.png
+ ../dataset/jp/test/67_input.jpg ../dataset/jp/test/67_wall.png ../dataset/jp/test/67_close.png ../dataset/jp/test/67_rooms.png ../dataset/jp/test/67_close_wall.png
+ ../dataset/jp/test/68_input.jpg ../dataset/jp/test/68_wall.png ../dataset/jp/test/68_close.png ../dataset/jp/test/68_rooms.png ../dataset/jp/test/68_close_wall.png
+ ../dataset/jp/test/69_input.jpg ../dataset/jp/test/69_wall.png ../dataset/jp/test/69_close.png ../dataset/jp/test/69_rooms.png ../dataset/jp/test/69_close_wall.png
+ ../dataset/jp/test/6_input.jpg ../dataset/jp/test/6_wall.png ../dataset/jp/test/6_close.png ../dataset/jp/test/6_rooms.png ../dataset/jp/test/6_close_wall.png
+ ../dataset/jp/test/70_input.jpg ../dataset/jp/test/70_wall.png ../dataset/jp/test/70_close.png ../dataset/jp/test/70_rooms.png ../dataset/jp/test/70_close_wall.png
+ ../dataset/jp/test/71_input.jpg ../dataset/jp/test/71_wall.png ../dataset/jp/test/71_close.png ../dataset/jp/test/71_rooms.png ../dataset/jp/test/71_close_wall.png
+ ../dataset/jp/test/72_input.jpg ../dataset/jp/test/72_wall.png ../dataset/jp/test/72_close.png ../dataset/jp/test/72_rooms.png ../dataset/jp/test/72_close_wall.png
+ ../dataset/jp/test/73_input.jpg ../dataset/jp/test/73_wall.png ../dataset/jp/test/73_close.png ../dataset/jp/test/73_rooms.png ../dataset/jp/test/73_close_wall.png
+ ../dataset/jp/test/74_input.jpg ../dataset/jp/test/74_wall.png ../dataset/jp/test/74_close.png ../dataset/jp/test/74_rooms.png ../dataset/jp/test/74_close_wall.png
+ ../dataset/jp/test/75_input.jpg ../dataset/jp/test/75_wall.png ../dataset/jp/test/75_close.png ../dataset/jp/test/75_rooms.png ../dataset/jp/test/75_close_wall.png
+ ../dataset/jp/test/76_input.jpg ../dataset/jp/test/76_wall.png ../dataset/jp/test/76_close.png ../dataset/jp/test/76_rooms.png ../dataset/jp/test/76_close_wall.png
+ ../dataset/jp/test/77_input.jpg ../dataset/jp/test/77_wall.png ../dataset/jp/test/77_close.png ../dataset/jp/test/77_rooms.png ../dataset/jp/test/77_close_wall.png
+ ../dataset/jp/test/78_input.jpg ../dataset/jp/test/78_wall.png ../dataset/jp/test/78_close.png ../dataset/jp/test/78_rooms.png ../dataset/jp/test/78_close_wall.png
+ ../dataset/jp/test/79_input.jpg ../dataset/jp/test/79_wall.png ../dataset/jp/test/79_close.png ../dataset/jp/test/79_rooms.png ../dataset/jp/test/79_close_wall.png
+ ../dataset/jp/test/7_input.jpg ../dataset/jp/test/7_wall.png ../dataset/jp/test/7_close.png ../dataset/jp/test/7_rooms.png ../dataset/jp/test/7_close_wall.png
+ ../dataset/jp/test/80_input.jpg ../dataset/jp/test/80_wall.png ../dataset/jp/test/80_close.png ../dataset/jp/test/80_rooms.png ../dataset/jp/test/80_close_wall.png
+ ../dataset/jp/test/81_input.jpg ../dataset/jp/test/81_wall.png ../dataset/jp/test/81_close.png ../dataset/jp/test/81_rooms.png ../dataset/jp/test/81_close_wall.png
+ ../dataset/jp/test/82_input.jpg ../dataset/jp/test/82_wall.png ../dataset/jp/test/82_close.png ../dataset/jp/test/82_rooms.png ../dataset/jp/test/82_close_wall.png
+ ../dataset/jp/test/83_input.jpg ../dataset/jp/test/83_wall.png ../dataset/jp/test/83_close.png ../dataset/jp/test/83_rooms.png ../dataset/jp/test/83_close_wall.png
+ ../dataset/jp/test/84_input.jpg ../dataset/jp/test/84_wall.png ../dataset/jp/test/84_close.png ../dataset/jp/test/84_rooms.png ../dataset/jp/test/84_close_wall.png
+ ../dataset/jp/test/85_input.jpg ../dataset/jp/test/85_wall.png ../dataset/jp/test/85_close.png ../dataset/jp/test/85_rooms.png ../dataset/jp/test/85_close_wall.png
+ ../dataset/jp/test/86_input.jpg ../dataset/jp/test/86_wall.png ../dataset/jp/test/86_close.png ../dataset/jp/test/86_rooms.png ../dataset/jp/test/86_close_wall.png
+ ../dataset/jp/test/87_input.jpg ../dataset/jp/test/87_wall.png ../dataset/jp/test/87_close.png ../dataset/jp/test/87_rooms.png ../dataset/jp/test/87_close_wall.png
+ ../dataset/jp/test/88_input.jpg ../dataset/jp/test/88_wall.png ../dataset/jp/test/88_close.png ../dataset/jp/test/88_rooms.png ../dataset/jp/test/88_close_wall.png
+ ../dataset/jp/test/89_input.jpg ../dataset/jp/test/89_wall.png ../dataset/jp/test/89_close.png ../dataset/jp/test/89_rooms.png ../dataset/jp/test/89_close_wall.png
+ ../dataset/jp/test/8_input.jpg ../dataset/jp/test/8_wall.png ../dataset/jp/test/8_close.png ../dataset/jp/test/8_rooms.png ../dataset/jp/test/8_close_wall.png
+ ../dataset/jp/test/90_input.jpg ../dataset/jp/test/90_wall.png ../dataset/jp/test/90_close.png ../dataset/jp/test/90_rooms.png ../dataset/jp/test/90_close_wall.png
+ ../dataset/jp/test/91_input.jpg ../dataset/jp/test/91_wall.png ../dataset/jp/test/91_close.png ../dataset/jp/test/91_rooms.png ../dataset/jp/test/91_close_wall.png
+ ../dataset/jp/test/92_input.jpg ../dataset/jp/test/92_wall.png ../dataset/jp/test/92_close.png ../dataset/jp/test/92_rooms.png ../dataset/jp/test/92_close_wall.png
+ ../dataset/jp/test/93_input.jpg ../dataset/jp/test/93_wall.png ../dataset/jp/test/93_close.png ../dataset/jp/test/93_rooms.png ../dataset/jp/test/93_close_wall.png
+ ../dataset/jp/test/94_input.jpg ../dataset/jp/test/94_wall.png ../dataset/jp/test/94_close.png ../dataset/jp/test/94_rooms.png ../dataset/jp/test/94_close_wall.png
+ ../dataset/jp/test/95_input.jpg ../dataset/jp/test/95_wall.png ../dataset/jp/test/95_close.png ../dataset/jp/test/95_rooms.png ../dataset/jp/test/95_close_wall.png
+ ../dataset/jp/test/96_input.jpg ../dataset/jp/test/96_wall.png ../dataset/jp/test/96_close.png ../dataset/jp/test/96_rooms.png ../dataset/jp/test/96_close_wall.png
+ ../dataset/jp/test/97_input.jpg ../dataset/jp/test/97_wall.png ../dataset/jp/test/97_close.png ../dataset/jp/test/97_rooms.png ../dataset/jp/test/97_close_wall.png
+ ../dataset/jp/test/98_input.jpg ../dataset/jp/test/98_wall.png ../dataset/jp/test/98_close.png ../dataset/jp/test/98_rooms.png ../dataset/jp/test/98_close_wall.png
+ ../dataset/jp/test/99_input.jpg ../dataset/jp/test/99_wall.png ../dataset/jp/test/99_close.png ../dataset/jp/test/99_rooms.png ../dataset/jp/test/99_close_wall.png
+ ../dataset/jp/test/9_input.jpg ../dataset/jp/test/9_wall.png ../dataset/jp/test/9_close.png ../dataset/jp/test/9_rooms.png ../dataset/jp/test/9_close_wall.png
dataset/r2v_train.txt ADDED
The diff for this file is too large to render. See raw diff
 
dataset/r3d_test.txt ADDED
@@ -0,0 +1,53 @@
+ ../dataset/newyork/test/21.jpg ../dataset/newyork/test/21_wall.png ../dataset/newyork/test/21_close.png ../dataset/newyork/test/21_rooms.png ../dataset/newyork/test/21_close_wall.png
+ ../dataset/newyork/test/30691830.jpg ../dataset/newyork/test/30691830_wall.png ../dataset/newyork/test/30691830_close.png ../dataset/newyork/test/30691830_rooms.png ../dataset/newyork/test/30691830_close_wall.png
+ ../dataset/newyork/test/31074492.jpg ../dataset/newyork/test/31074492_wall.png ../dataset/newyork/test/31074492_close.png ../dataset/newyork/test/31074492_rooms.png ../dataset/newyork/test/31074492_close_wall.png
+ ../dataset/newyork/test/31837524.jpg ../dataset/newyork/test/31837524_wall.png ../dataset/newyork/test/31837524_close.png ../dataset/newyork/test/31837524_rooms.png ../dataset/newyork/test/31837524_close_wall.png
+ ../dataset/newyork/test/31851141.jpg ../dataset/newyork/test/31851141_wall.png ../dataset/newyork/test/31851141_close.png ../dataset/newyork/test/31851141_rooms.png ../dataset/newyork/test/31851141_close_wall.png
+ ../dataset/newyork/test/31873188.jpg ../dataset/newyork/test/31873188_wall.png ../dataset/newyork/test/31873188_close.png ../dataset/newyork/test/31873188_rooms.png ../dataset/newyork/test/31873188_close_wall.png
+ ../dataset/newyork/test/31889856.jpg ../dataset/newyork/test/31889856_wall.png ../dataset/newyork/test/31889856_close.png ../dataset/newyork/test/31889856_rooms.png ../dataset/newyork/test/31889856_close_wall.png
+ ../dataset/newyork/test/43949851.jpg ../dataset/newyork/test/43949851_wall.png ../dataset/newyork/test/43949851_close.png ../dataset/newyork/test/43949851_rooms.png ../dataset/newyork/test/43949851_close_wall.png
+ ../dataset/newyork/test/44777104.jpg ../dataset/newyork/test/44777104_wall.png ../dataset/newyork/test/44777104_close.png ../dataset/newyork/test/44777104_rooms.png ../dataset/newyork/test/44777104_close_wall.png
+ ../dataset/newyork/test/45157357.jpg ../dataset/newyork/test/45157357_wall.png ../dataset/newyork/test/45157357_close.png ../dataset/newyork/test/45157357_rooms.png ../dataset/newyork/test/45157357_close_wall.png
+ ../dataset/newyork/test/45299197.jpg ../dataset/newyork/test/45299197_wall.png ../dataset/newyork/test/45299197_close.png ../dataset/newyork/test/45299197_rooms.png ../dataset/newyork/test/45299197_close_wall.png
+ ../dataset/newyork/test/45348658.jpg ../dataset/newyork/test/45348658_wall.png ../dataset/newyork/test/45348658_close.png ../dataset/newyork/test/45348658_rooms.png ../dataset/newyork/test/45348658_close_wall.png
+ ../dataset/newyork/test/45719584.jpg ../dataset/newyork/test/45719584_wall.png ../dataset/newyork/test/45719584_close.png ../dataset/newyork/test/45719584_rooms.png ../dataset/newyork/test/45719584_close_wall.png
+ ../dataset/newyork/test/45720004.jpg ../dataset/newyork/test/45720004_wall.png ../dataset/newyork/test/45720004_close.png ../dataset/newyork/test/45720004_rooms.png ../dataset/newyork/test/45720004_close_wall.png
+ ../dataset/newyork/test/45724132.jpg ../dataset/newyork/test/45724132_wall.png ../dataset/newyork/test/45724132_close.png ../dataset/newyork/test/45724132_rooms.png ../dataset/newyork/test/45724132_close_wall.png
+ ../dataset/newyork/test/45724363.jpg ../dataset/newyork/test/45724363_wall.png ../dataset/newyork/test/45724363_close.png ../dataset/newyork/test/45724363_rooms.png ../dataset/newyork/test/45724363_close_wall.png
+ ../dataset/newyork/test/45724372.jpg ../dataset/newyork/test/45724372_wall.png ../dataset/newyork/test/45724372_close.png ../dataset/newyork/test/45724372_rooms.png ../dataset/newyork/test/45724372_close_wall.png
+ ../dataset/newyork/test/45740533.jpg ../dataset/newyork/test/45740533_wall.png ../dataset/newyork/test/45740533_close.png ../dataset/newyork/test/45740533_rooms.png ../dataset/newyork/test/45740533_close_wall.png
+ ../dataset/newyork/test/45765448.jpg ../dataset/newyork/test/45765448_wall.png ../dataset/newyork/test/45765448_close.png ../dataset/newyork/test/45765448_rooms.png ../dataset/newyork/test/45765448_close_wall.png
+ ../dataset/newyork/test/45775069.jpg ../dataset/newyork/test/45775069_wall.png ../dataset/newyork/test/45775069_close.png ../dataset/newyork/test/45775069_rooms.png ../dataset/newyork/test/45775069_close_wall.png
+ ../dataset/newyork/test/45780715.jpg ../dataset/newyork/test/45780715_wall.png ../dataset/newyork/test/45780715_close.png ../dataset/newyork/test/45780715_rooms.png ../dataset/newyork/test/45780715_close_wall.png
+ ../dataset/newyork/test/46543250.jpg ../dataset/newyork/test/46543250_wall.png ../dataset/newyork/test/46543250_close.png ../dataset/newyork/test/46543250_rooms.png ../dataset/newyork/test/46543250_close_wall.png
+ ../dataset/newyork/test/47464145.jpg ../dataset/newyork/test/47464145_wall.png ../dataset/newyork/test/47464145_close.png ../dataset/newyork/test/47464145_rooms.png ../dataset/newyork/test/47464145_close_wall.png
+ ../dataset/newyork/test/47485670.jpg ../dataset/newyork/test/47485670_wall.png ../dataset/newyork/test/47485670_close.png ../dataset/newyork/test/47485670_rooms.png ../dataset/newyork/test/47485670_close_wall.png
+ ../dataset/newyork/test/47489612.jpg ../dataset/newyork/test/47489612_wall.png ../dataset/newyork/test/47489612_close.png ../dataset/newyork/test/47489612_rooms.png ../dataset/newyork/test/47489612_close_wall.png
+ ../dataset/newyork/test/47499272.jpg ../dataset/newyork/test/47499272_wall.png ../dataset/newyork/test/47499272_close.png ../dataset/newyork/test/47499272_rooms.png ../dataset/newyork/test/47499272_close_wall.png
+ ../dataset/newyork/test/47499362.jpg ../dataset/newyork/test/47499362_wall.png ../dataset/newyork/test/47499362_close.png ../dataset/newyork/test/47499362_rooms.png ../dataset/newyork/test/47499362_close_wall.png
+ ../dataset/newyork/test/47505362.jpg ../dataset/newyork/test/47505362_wall.png ../dataset/newyork/test/47505362_close.png ../dataset/newyork/test/47505362_rooms.png ../dataset/newyork/test/47505362_close_wall.png
+ ../dataset/newyork/test/47525504.jpg ../dataset/newyork/test/47525504_wall.png ../dataset/newyork/test/47525504_close.png ../dataset/newyork/test/47525504_rooms.png ../dataset/newyork/test/47525504_close_wall.png
+ ../dataset/newyork/test/47541842.jpg ../dataset/newyork/test/47541842_wall.png ../dataset/newyork/test/47541842_close.png ../dataset/newyork/test/47541842_rooms.png ../dataset/newyork/test/47541842_close_wall.png
+ ../dataset/newyork/test/47541845.jpg ../dataset/newyork/test/47541845_wall.png ../dataset/newyork/test/47541845_close.png ../dataset/newyork/test/47541845_rooms.png ../dataset/newyork/test/47541845_close_wall.png
+ ../dataset/newyork/test/47541857.jpg ../dataset/newyork/test/47541857_wall.png ../dataset/newyork/test/47541857_close.png ../dataset/newyork/test/47541857_rooms.png ../dataset/newyork/test/47541857_close_wall.png
+ ../dataset/newyork/test/47541860.jpg ../dataset/newyork/test/47541860_wall.png ../dataset/newyork/test/47541860_close.png ../dataset/newyork/test/47541860_rooms.png ../dataset/newyork/test/47541860_close_wall.png
+ ../dataset/newyork/test/47541863.jpg ../dataset/newyork/test/47541863_wall.png ../dataset/newyork/test/47541863_close.png ../dataset/newyork/test/47541863_rooms.png ../dataset/newyork/test/47541863_close_wall.png
+ ../dataset/newyork/test/47541866.jpg ../dataset/newyork/test/47541866_wall.png ../dataset/newyork/test/47541866_close.png ../dataset/newyork/test/47541866_rooms.png ../dataset/newyork/test/47541866_close_wall.png
+ ../dataset/newyork/test/47542733.jpg ../dataset/newyork/test/47542733_wall.png ../dataset/newyork/test/47542733_close.png ../dataset/newyork/test/47542733_rooms.png ../dataset/newyork/test/47542733_close_wall.png
+ ../dataset/newyork/test/47542745.jpg ../dataset/newyork/test/47542745_wall.png ../dataset/newyork/test/47542745_close.png ../dataset/newyork/test/47542745_rooms.png ../dataset/newyork/test/47542745_close_wall.png
+ ../dataset/newyork/test/47545139.jpg ../dataset/newyork/test/47545139_wall.png ../dataset/newyork/test/47545139_close.png ../dataset/newyork/test/47545139_rooms.png ../dataset/newyork/test/47545139_close_wall.png
+ ../dataset/newyork/test/47545145.jpg ../dataset/newyork/test/47545145_wall.png ../dataset/newyork/test/47545145_close.png ../dataset/newyork/test/47545145_rooms.png ../dataset/newyork/test/47545145_close_wall.png
+ ../dataset/newyork/test/47545148.jpg ../dataset/newyork/test/47545148_wall.png ../dataset/newyork/test/47545148_close.png ../dataset/newyork/test/47545148_rooms.png ../dataset/newyork/test/47545148_close_wall.png
+ ../dataset/newyork/test/47545160.jpg ../dataset/newyork/test/47545160_wall.png ../dataset/newyork/test/47545160_close.png ../dataset/newyork/test/47545160_rooms.png ../dataset/newyork/test/47545160_close_wall.png
+ ../dataset/newyork/test/47546432.jpg ../dataset/newyork/test/47546432_wall.png ../dataset/newyork/test/47546432_close.png ../dataset/newyork/test/47546432_rooms.png ../dataset/newyork/test/47546432_close_wall.png
+ ../dataset/newyork/test/47546639.jpg ../dataset/newyork/test/47546639_wall.png ../dataset/newyork/test/47546639_close.png ../dataset/newyork/test/47546639_rooms.png ../dataset/newyork/test/47546639_close_wall.png
+ ../dataset/newyork/test/47546846.jpg ../dataset/newyork/test/47546846_wall.png ../dataset/newyork/test/47546846_close.png ../dataset/newyork/test/47546846_rooms.png ../dataset/newyork/test/47546846_close_wall.png
+ ../dataset/newyork/test/47547656.jpg ../dataset/newyork/test/47547656_wall.png ../dataset/newyork/test/47547656_close.png ../dataset/newyork/test/47547656_rooms.png ../dataset/newyork/test/47547656_close_wall.png
+ ../dataset/newyork/test/47548484.jpg ../dataset/newyork/test/47548484_wall.png ../dataset/newyork/test/47548484_close.png ../dataset/newyork/test/47548484_rooms.png ../dataset/newyork/test/47548484_close_wall.png
+ ../dataset/newyork/test/47548487.jpg ../dataset/newyork/test/47548487_wall.png ../dataset/newyork/test/47548487_close.png ../dataset/newyork/test/47548487_rooms.png ../dataset/newyork/test/47548487_close_wall.png
+ ../dataset/newyork/test/55.jpg ../dataset/newyork/test/55_wall.png ../dataset/newyork/test/55_close.png ../dataset/newyork/test/55_rooms.png ../dataset/newyork/test/55_close_wall.png
+ ../dataset/newyork/test/60.jpg ../dataset/newyork/test/60_wall.png ../dataset/newyork/test/60_close.png ../dataset/newyork/test/60_rooms.png ../dataset/newyork/test/60_close_wall.png
+ ../dataset/newyork/test/62.jpg ../dataset/newyork/test/62_wall.png ../dataset/newyork/test/62_close.png ../dataset/newyork/test/62_rooms.png ../dataset/newyork/test/62_close_wall.png
+ ../dataset/newyork/test/65.jpg ../dataset/newyork/test/65_wall.png ../dataset/newyork/test/65_close.png ../dataset/newyork/test/65_rooms.png ../dataset/newyork/test/65_close_wall.png
+ ../dataset/newyork/test/75.jpg ../dataset/newyork/test/75_wall.png ../dataset/newyork/test/75_close.png ../dataset/newyork/test/75_rooms.png ../dataset/newyork/test/75_close_wall.png
+ ../dataset/newyork/test/9.jpg ../dataset/newyork/test/9_wall.png ../dataset/newyork/test/9_close.png ../dataset/newyork/test/9_rooms.png ../dataset/newyork/test/9_close_wall.png
dataset/r3d_train.txt ADDED
@@ -0,0 +1,179 @@
1
+ ../dataset/newyork/train/10.jpg ../dataset/newyork/train/10_wall.png ../dataset/newyork/train/10_close.png ../dataset/newyork/train/10_rooms.png ../dataset/newyork/train/10_close_wall.png
2
+ ../dataset/newyork/train/28025487.jpg ../dataset/newyork/train/28025487_wall.png ../dataset/newyork/train/28025487_close.png ../dataset/newyork/train/28025487_rooms.png ../dataset/newyork/train/28025487_close_wall.png
3
+ ../dataset/newyork/train/28906422.jpg ../dataset/newyork/train/28906422_wall.png ../dataset/newyork/train/28906422_close.png ../dataset/newyork/train/28906422_rooms.png ../dataset/newyork/train/28906422_close_wall.png
4
+ ../dataset/newyork/train/2.jpg ../dataset/newyork/train/2_wall.png ../dataset/newyork/train/2_close.png ../dataset/newyork/train/2_rooms.png ../dataset/newyork/train/2_close_wall.png
5
+ ../dataset/newyork/train/30044076.jpg ../dataset/newyork/train/30044076_wall.png ../dataset/newyork/train/30044076_close.png ../dataset/newyork/train/30044076_rooms.png ../dataset/newyork/train/30044076_close_wall.png
6
+ ../dataset/newyork/train/30049107.jpg ../dataset/newyork/train/30049107_wall.png ../dataset/newyork/train/30049107_close.png ../dataset/newyork/train/30049107_rooms.png ../dataset/newyork/train/30049107_close_wall.png
7
+ ../dataset/newyork/train/30615117.jpg ../dataset/newyork/train/30615117_wall.png ../dataset/newyork/train/30615117_close.png ../dataset/newyork/train/30615117_rooms.png ../dataset/newyork/train/30615117_close_wall.png
8
+ ../dataset/newyork/train/30939153.jpg ../dataset/newyork/train/30939153_wall.png ../dataset/newyork/train/30939153_close.png ../dataset/newyork/train/30939153_rooms.png ../dataset/newyork/train/30939153_close_wall.png
9
+ ../dataset/newyork/train/30957516.jpg ../dataset/newyork/train/30957516_wall.png ../dataset/newyork/train/30957516_close.png ../dataset/newyork/train/30957516_rooms.png ../dataset/newyork/train/30957516_close_wall.png
10
+ ../dataset/newyork/train/31036152.jpg ../dataset/newyork/train/31036152_wall.png ../dataset/newyork/train/31036152_close.png ../dataset/newyork/train/31036152_rooms.png ../dataset/newyork/train/31036152_close_wall.png
11
+ ../dataset/newyork/train/31234182.jpg ../dataset/newyork/train/31234182_wall.png ../dataset/newyork/train/31234182_close.png ../dataset/newyork/train/31234182_rooms.png ../dataset/newyork/train/31234182_close_wall.png
12
+ ../dataset/newyork/train/31272420.jpg ../dataset/newyork/train/31272420_wall.png ../dataset/newyork/train/31272420_close.png ../dataset/newyork/train/31272420_rooms.png ../dataset/newyork/train/31272420_close_wall.png
13
+ ../dataset/newyork/train/31318404.jpg ../dataset/newyork/train/31318404_wall.png ../dataset/newyork/train/31318404_close.png ../dataset/newyork/train/31318404_rooms.png ../dataset/newyork/train/31318404_close_wall.png
14
+ ../dataset/newyork/train/31418847.jpg ../dataset/newyork/train/31418847_wall.png ../dataset/newyork/train/31418847_close.png ../dataset/newyork/train/31418847_rooms.png ../dataset/newyork/train/31418847_close_wall.png
15
+ ../dataset/newyork/train/31431717.jpg ../dataset/newyork/train/31431717_wall.png ../dataset/newyork/train/31431717_close.png ../dataset/newyork/train/31431717_rooms.png ../dataset/newyork/train/31431717_close_wall.png
16
+ ../dataset/newyork/train/31450071.jpg ../dataset/newyork/train/31450071_wall.png ../dataset/newyork/train/31450071_close.png ../dataset/newyork/train/31450071_rooms.png ../dataset/newyork/train/31450071_close_wall.png
17
+ ../dataset/newyork/train/31483593.jpg ../dataset/newyork/train/31483593_wall.png ../dataset/newyork/train/31483593_close.png ../dataset/newyork/train/31483593_rooms.png ../dataset/newyork/train/31483593_close_wall.png
18
+ ../dataset/newyork/train/31491612.jpg ../dataset/newyork/train/31491612_wall.png ../dataset/newyork/train/31491612_close.png ../dataset/newyork/train/31491612_rooms.png ../dataset/newyork/train/31491612_close_wall.png
19
+ ../dataset/newyork/train/31566489.jpg ../dataset/newyork/train/31566489_wall.png ../dataset/newyork/train/31566489_close.png ../dataset/newyork/train/31566489_rooms.png ../dataset/newyork/train/31566489_close_wall.png
20
+ ../dataset/newyork/train/31567842.jpg ../dataset/newyork/train/31567842_wall.png ../dataset/newyork/train/31567842_close.png ../dataset/newyork/train/31567842_rooms.png ../dataset/newyork/train/31567842_close_wall.png
21
+ ../dataset/newyork/train/31573533.jpg ../dataset/newyork/train/31573533_wall.png ../dataset/newyork/train/31573533_close.png ../dataset/newyork/train/31573533_rooms.png ../dataset/newyork/train/31573533_close_wall.png
22
+ ../dataset/newyork/train/31677402.jpg ../dataset/newyork/train/31677402_wall.png ../dataset/newyork/train/31677402_close.png ../dataset/newyork/train/31677402_rooms.png ../dataset/newyork/train/31677402_close_wall.png
23
+ ../dataset/newyork/train/31683135.jpg ../dataset/newyork/train/31683135_wall.png ../dataset/newyork/train/31683135_close.png ../dataset/newyork/train/31683135_rooms.png ../dataset/newyork/train/31683135_close_wall.png
24
+ ../dataset/newyork/train/31727418.jpg ../dataset/newyork/train/31727418_wall.png ../dataset/newyork/train/31727418_close.png ../dataset/newyork/train/31727418_rooms.png ../dataset/newyork/train/31727418_close_wall.png
25
+ ../dataset/newyork/train/31814460.jpg ../dataset/newyork/train/31814460_wall.png ../dataset/newyork/train/31814460_close.png ../dataset/newyork/train/31814460_rooms.png ../dataset/newyork/train/31814460_close_wall.png
26
+ ../dataset/newyork/train/31820961.jpg ../dataset/newyork/train/31820961_wall.png ../dataset/newyork/train/31820961_close.png ../dataset/newyork/train/31820961_rooms.png ../dataset/newyork/train/31820961_close_wall.png
27
+ ../dataset/newyork/train/31826949.jpg ../dataset/newyork/train/31826949_wall.png ../dataset/newyork/train/31826949_close.png ../dataset/newyork/train/31826949_rooms.png ../dataset/newyork/train/31826949_close_wall.png
28
+ ../dataset/newyork/train/31829949.jpg ../dataset/newyork/train/31829949_wall.png ../dataset/newyork/train/31829949_close.png ../dataset/newyork/train/31829949_rooms.png ../dataset/newyork/train/31829949_close_wall.png
29
+ ../dataset/newyork/train/31830006.jpg ../dataset/newyork/train/31830006_wall.png ../dataset/newyork/train/31830006_close.png ../dataset/newyork/train/31830006_rooms.png ../dataset/newyork/train/31830006_close_wall.png
30
+ ../dataset/newyork/train/31830138.jpg ../dataset/newyork/train/31830138_wall.png ../dataset/newyork/train/31830138_close.png ../dataset/newyork/train/31830138_rooms.png ../dataset/newyork/train/31830138_close_wall.png
31
+ ../dataset/newyork/train/31830141.jpg ../dataset/newyork/train/31830141_wall.png ../dataset/newyork/train/31830141_close.png ../dataset/newyork/train/31830141_rooms.png ../dataset/newyork/train/31830141_close_wall.png
32
+ ../dataset/newyork/train/31830270.jpg ../dataset/newyork/train/31830270_wall.png ../dataset/newyork/train/31830270_close.png ../dataset/newyork/train/31830270_rooms.png ../dataset/newyork/train/31830270_close_wall.png
33
+ ../dataset/newyork/train/31833933.jpg ../dataset/newyork/train/31833933_wall.png ../dataset/newyork/train/31833933_close.png ../dataset/newyork/train/31833933_rooms.png ../dataset/newyork/train/31833933_close_wall.png
34
+ ../dataset/newyork/train/31834719.jpg ../dataset/newyork/train/31834719_wall.png ../dataset/newyork/train/31834719_close.png ../dataset/newyork/train/31834719_rooms.png ../dataset/newyork/train/31834719_close_wall.png
35
+ ../dataset/newyork/train/31834734.jpg ../dataset/newyork/train/31834734_wall.png ../dataset/newyork/train/31834734_close.png ../dataset/newyork/train/31834734_rooms.png ../dataset/newyork/train/31834734_close_wall.png
36
+ ../dataset/newyork/train/31835886.jpg ../dataset/newyork/train/31835886_wall.png ../dataset/newyork/train/31835886_close.png ../dataset/newyork/train/31835886_rooms.png ../dataset/newyork/train/31835886_close_wall.png
37
+ ../dataset/newyork/train/31847853.jpg ../dataset/newyork/train/31847853_wall.png ../dataset/newyork/train/31847853_close.png ../dataset/newyork/train/31847853_rooms.png ../dataset/newyork/train/31847853_close_wall.png
38
+ ../dataset/newyork/train/31850325.jpg ../dataset/newyork/train/31850325_wall.png ../dataset/newyork/train/31850325_close.png ../dataset/newyork/train/31850325_rooms.png ../dataset/newyork/train/31850325_close_wall.png
39
+ ../dataset/newyork/train/31850409.jpg ../dataset/newyork/train/31850409_wall.png ../dataset/newyork/train/31850409_close.png ../dataset/newyork/train/31850409_rooms.png ../dataset/newyork/train/31850409_close_wall.png
40
+ ../dataset/newyork/train/31852926.jpg ../dataset/newyork/train/31852926_wall.png ../dataset/newyork/train/31852926_close.png ../dataset/newyork/train/31852926_rooms.png ../dataset/newyork/train/31852926_close_wall.png
41
+ ../dataset/newyork/train/31852929.jpg ../dataset/newyork/train/31852929_wall.png ../dataset/newyork/train/31852929_close.png ../dataset/newyork/train/31852929_rooms.png ../dataset/newyork/train/31852929_close_wall.png
42
+ ../dataset/newyork/train/31852932.jpg ../dataset/newyork/train/31852932_wall.png ../dataset/newyork/train/31852932_close.png ../dataset/newyork/train/31852932_rooms.png ../dataset/newyork/train/31852932_close_wall.png
43
+ ../dataset/newyork/train/31857804.jpg ../dataset/newyork/train/31857804_wall.png ../dataset/newyork/train/31857804_close.png ../dataset/newyork/train/31857804_rooms.png ../dataset/newyork/train/31857804_close_wall.png
44
+ ../dataset/newyork/train/31868853.jpg ../dataset/newyork/train/31868853_wall.png ../dataset/newyork/train/31868853_close.png ../dataset/newyork/train/31868853_rooms.png ../dataset/newyork/train/31868853_close_wall.png
45
+ ../dataset/newyork/train/31870182.jpg ../dataset/newyork/train/31870182_wall.png ../dataset/newyork/train/31870182_close.png ../dataset/newyork/train/31870182_rooms.png ../dataset/newyork/train/31870182_close_wall.png
46
+ ../dataset/newyork/train/31870983.jpg ../dataset/newyork/train/31870983_wall.png ../dataset/newyork/train/31870983_close.png ../dataset/newyork/train/31870983_rooms.png ../dataset/newyork/train/31870983_close_wall.png
47
+ ../dataset/newyork/train/31871118.jpg ../dataset/newyork/train/31871118_wall.png ../dataset/newyork/train/31871118_close.png ../dataset/newyork/train/31871118_rooms.png ../dataset/newyork/train/31871118_close_wall.png
48
+ ../dataset/newyork/train/31871448.jpg ../dataset/newyork/train/31871448_wall.png ../dataset/newyork/train/31871448_close.png ../dataset/newyork/train/31871448_rooms.png ../dataset/newyork/train/31871448_close_wall.png
49
+ ../dataset/newyork/train/31872336.jpg ../dataset/newyork/train/31872336_wall.png ../dataset/newyork/train/31872336_close.png ../dataset/newyork/train/31872336_rooms.png ../dataset/newyork/train/31872336_close_wall.png
50
+ ../dataset/newyork/train/31872645.jpg ../dataset/newyork/train/31872645_wall.png ../dataset/newyork/train/31872645_close.png ../dataset/newyork/train/31872645_rooms.png ../dataset/newyork/train/31872645_close_wall.png
51
+ ../dataset/newyork/train/31873326.jpg ../dataset/newyork/train/31873326_wall.png ../dataset/newyork/train/31873326_close.png ../dataset/newyork/train/31873326_rooms.png ../dataset/newyork/train/31873326_close_wall.png
52
+ ../dataset/newyork/train/31874937.jpg ../dataset/newyork/train/31874937_wall.png ../dataset/newyork/train/31874937_close.png ../dataset/newyork/train/31874937_rooms.png ../dataset/newyork/train/31874937_close_wall.png
53
+ ../dataset/newyork/train/31878534.jpg ../dataset/newyork/train/31878534_wall.png ../dataset/newyork/train/31878534_close.png ../dataset/newyork/train/31878534_rooms.png ../dataset/newyork/train/31878534_close_wall.png
54
+ ../dataset/newyork/train/31878567.jpg ../dataset/newyork/train/31878567_wall.png ../dataset/newyork/train/31878567_close.png ../dataset/newyork/train/31878567_rooms.png ../dataset/newyork/train/31878567_close_wall.png
55
+ ../dataset/newyork/train/31878750.jpg ../dataset/newyork/train/31878750_wall.png ../dataset/newyork/train/31878750_close.png ../dataset/newyork/train/31878750_rooms.png ../dataset/newyork/train/31878750_close_wall.png
56
+ ../dataset/newyork/train/31878855.jpg ../dataset/newyork/train/31878855_wall.png ../dataset/newyork/train/31878855_close.png ../dataset/newyork/train/31878855_rooms.png ../dataset/newyork/train/31878855_close_wall.png
57
+ ../dataset/newyork/train/31878867.jpg ../dataset/newyork/train/31878867_wall.png ../dataset/newyork/train/31878867_close.png ../dataset/newyork/train/31878867_rooms.png ../dataset/newyork/train/31878867_close_wall.png
58
+ ../dataset/newyork/train/31878870.jpg ../dataset/newyork/train/31878870_wall.png ../dataset/newyork/train/31878870_close.png ../dataset/newyork/train/31878870_rooms.png ../dataset/newyork/train/31878870_close_wall.png
59
+ ../dataset/newyork/train/31882362.jpg ../dataset/newyork/train/31882362_wall.png ../dataset/newyork/train/31882362_close.png ../dataset/newyork/train/31882362_rooms.png ../dataset/newyork/train/31882362_close_wall.png
60
+ ../dataset/newyork/train/31883016.jpg ../dataset/newyork/train/31883016_wall.png ../dataset/newyork/train/31883016_close.png ../dataset/newyork/train/31883016_rooms.png ../dataset/newyork/train/31883016_close_wall.png
61
+ ../dataset/newyork/train/31883034.jpg ../dataset/newyork/train/31883034_wall.png ../dataset/newyork/train/31883034_close.png ../dataset/newyork/train/31883034_rooms.png ../dataset/newyork/train/31883034_close_wall.png
62
+ ../dataset/newyork/train/31883331.jpg ../dataset/newyork/train/31883331_wall.png ../dataset/newyork/train/31883331_close.png ../dataset/newyork/train/31883331_rooms.png ../dataset/newyork/train/31883331_close_wall.png
63
+ ../dataset/newyork/train/31887483.jpg ../dataset/newyork/train/31887483_wall.png ../dataset/newyork/train/31887483_close.png ../dataset/newyork/train/31887483_rooms.png ../dataset/newyork/train/31887483_close_wall.png
64
+ ../dataset/newyork/train/31887492.jpg ../dataset/newyork/train/31887492_wall.png ../dataset/newyork/train/31887492_close.png ../dataset/newyork/train/31887492_rooms.png ../dataset/newyork/train/31887492_close_wall.png
65
+ ../dataset/newyork/train/31889847.jpg ../dataset/newyork/train/31889847_wall.png ../dataset/newyork/train/31889847_close.png ../dataset/newyork/train/31889847_rooms.png ../dataset/newyork/train/31889847_close_wall.png
66
+ ../dataset/newyork/train/31890228.jpg ../dataset/newyork/train/31890228_wall.png ../dataset/newyork/train/31890228_close.png ../dataset/newyork/train/31890228_rooms.png ../dataset/newyork/train/31890228_close_wall.png
67
+ ../dataset/newyork/train/38877131.jpg ../dataset/newyork/train/38877131_wall.png ../dataset/newyork/train/38877131_close.png ../dataset/newyork/train/38877131_rooms.png ../dataset/newyork/train/38877131_close_wall.png
68
+ ../dataset/newyork/train/39.jpg ../dataset/newyork/train/39_wall.png ../dataset/newyork/train/39_close.png ../dataset/newyork/train/39_rooms.png ../dataset/newyork/train/39_close_wall.png
69
+ ../dataset/newyork/train/3.jpg ../dataset/newyork/train/3_wall.png ../dataset/newyork/train/3_close.png ../dataset/newyork/train/3_rooms.png ../dataset/newyork/train/3_close_wall.png
70
+ ../dataset/newyork/train/41459443.jpg ../dataset/newyork/train/41459443_wall.png ../dataset/newyork/train/41459443_close.png ../dataset/newyork/train/41459443_rooms.png ../dataset/newyork/train/41459443_close_wall.png
71
+ ../dataset/newyork/train/42761030.jpg ../dataset/newyork/train/42761030_wall.png ../dataset/newyork/train/42761030_close.png ../dataset/newyork/train/42761030_rooms.png ../dataset/newyork/train/42761030_close_wall.png
72
+ ../dataset/newyork/train/43169833.jpg ../dataset/newyork/train/43169833_wall.png ../dataset/newyork/train/43169833_close.png ../dataset/newyork/train/43169833_rooms.png ../dataset/newyork/train/43169833_close_wall.png
73
+ ../dataset/newyork/train/43661446.jpg ../dataset/newyork/train/43661446_wall.png ../dataset/newyork/train/43661446_close.png ../dataset/newyork/train/43661446_rooms.png ../dataset/newyork/train/43661446_close_wall.png
74
+ ../dataset/newyork/train/43778570.jpg ../dataset/newyork/train/43778570_wall.png ../dataset/newyork/train/43778570_close.png ../dataset/newyork/train/43778570_rooms.png ../dataset/newyork/train/43778570_close_wall.png
75
+ ../dataset/newyork/train/44356523.jpg ../dataset/newyork/train/44356523_wall.png ../dataset/newyork/train/44356523_close.png ../dataset/newyork/train/44356523_rooms.png ../dataset/newyork/train/44356523_close_wall.png
76
+ ../dataset/newyork/train/44591497.jpg ../dataset/newyork/train/44591497_wall.png ../dataset/newyork/train/44591497_close.png ../dataset/newyork/train/44591497_rooms.png ../dataset/newyork/train/44591497_close_wall.png
77
+ ../dataset/newyork/train/44637031.jpg ../dataset/newyork/train/44637031_wall.png ../dataset/newyork/train/44637031_close.png ../dataset/newyork/train/44637031_rooms.png ../dataset/newyork/train/44637031_close_wall.png
78
+ ../dataset/newyork/train/44867164.jpg ../dataset/newyork/train/44867164_wall.png ../dataset/newyork/train/44867164_close.png ../dataset/newyork/train/44867164_rooms.png ../dataset/newyork/train/44867164_close_wall.png
79
+ ../dataset/newyork/train/45057754.jpg ../dataset/newyork/train/45057754_wall.png ../dataset/newyork/train/45057754_close.png ../dataset/newyork/train/45057754_rooms.png ../dataset/newyork/train/45057754_close_wall.png
80
+ ../dataset/newyork/train/45098284.jpg ../dataset/newyork/train/45098284_wall.png ../dataset/newyork/train/45098284_close.png ../dataset/newyork/train/45098284_rooms.png ../dataset/newyork/train/45098284_close_wall.png
81
+ ../dataset/newyork/train/45175987.jpg ../dataset/newyork/train/45175987_wall.png ../dataset/newyork/train/45175987_close.png ../dataset/newyork/train/45175987_rooms.png ../dataset/newyork/train/45175987_close_wall.png
82
+ ../dataset/newyork/train/45243139.jpg ../dataset/newyork/train/45243139_wall.png ../dataset/newyork/train/45243139_close.png ../dataset/newyork/train/45243139_rooms.png ../dataset/newyork/train/45243139_close_wall.png
83
+ ../dataset/newyork/train/45293770.jpg ../dataset/newyork/train/45293770_wall.png ../dataset/newyork/train/45293770_close.png ../dataset/newyork/train/45293770_rooms.png ../dataset/newyork/train/45293770_close_wall.png
84
+ ../dataset/newyork/train/45321949.jpg ../dataset/newyork/train/45321949_wall.png ../dataset/newyork/train/45321949_close.png ../dataset/newyork/train/45321949_rooms.png ../dataset/newyork/train/45321949_close_wall.png
85
+ ../dataset/newyork/train/45352198.jpg ../dataset/newyork/train/45352198_wall.png ../dataset/newyork/train/45352198_close.png ../dataset/newyork/train/45352198_rooms.png ../dataset/newyork/train/45352198_close_wall.png
86
+ ../dataset/newyork/train/45552859.jpg ../dataset/newyork/train/45552859_wall.png ../dataset/newyork/train/45552859_close.png ../dataset/newyork/train/45552859_rooms.png ../dataset/newyork/train/45552859_close_wall.png
+ ../dataset/newyork/train/45562633.jpg ../dataset/newyork/train/45562633_wall.png ../dataset/newyork/train/45562633_close.png ../dataset/newyork/train/45562633_rooms.png ../dataset/newyork/train/45562633_close_wall.png
+ ../dataset/newyork/train/45591070.jpg ../dataset/newyork/train/45591070_wall.png ../dataset/newyork/train/45591070_close.png ../dataset/newyork/train/45591070_rooms.png ../dataset/newyork/train/45591070_close_wall.png
+ ../dataset/newyork/train/45591514.jpg ../dataset/newyork/train/45591514_wall.png ../dataset/newyork/train/45591514_close.png ../dataset/newyork/train/45591514_rooms.png ../dataset/newyork/train/45591514_close_wall.png
+ ../dataset/newyork/train/45608287.jpg ../dataset/newyork/train/45608287_wall.png ../dataset/newyork/train/45608287_close.png ../dataset/newyork/train/45608287_rooms.png ../dataset/newyork/train/45608287_close_wall.png
+ ../dataset/newyork/train/45613198.jpg ../dataset/newyork/train/45613198_wall.png ../dataset/newyork/train/45613198_close.png ../dataset/newyork/train/45613198_rooms.png ../dataset/newyork/train/45613198_close_wall.png
+ ../dataset/newyork/train/45633769.jpg ../dataset/newyork/train/45633769_wall.png ../dataset/newyork/train/45633769_close.png ../dataset/newyork/train/45633769_rooms.png ../dataset/newyork/train/45633769_close_wall.png
+ ../dataset/newyork/train/45665893.jpg ../dataset/newyork/train/45665893_wall.png ../dataset/newyork/train/45665893_close.png ../dataset/newyork/train/45665893_rooms.png ../dataset/newyork/train/45665893_close_wall.png
+ ../dataset/newyork/train/45708481.jpg ../dataset/newyork/train/45708481_wall.png ../dataset/newyork/train/45708481_close.png ../dataset/newyork/train/45708481_rooms.png ../dataset/newyork/train/45708481_close_wall.png
+ ../dataset/newyork/train/45714223.jpg ../dataset/newyork/train/45714223_wall.png ../dataset/newyork/train/45714223_close.png ../dataset/newyork/train/45714223_rooms.png ../dataset/newyork/train/45714223_close_wall.png
+ ../dataset/newyork/train/45716029.jpg ../dataset/newyork/train/45716029_wall.png ../dataset/newyork/train/45716029_close.png ../dataset/newyork/train/45716029_rooms.png ../dataset/newyork/train/45716029_close_wall.png
+ ../dataset/newyork/train/45719912.jpg ../dataset/newyork/train/45719912_wall.png ../dataset/newyork/train/45719912_close.png ../dataset/newyork/train/45719912_rooms.png ../dataset/newyork/train/45719912_close_wall.png
+ ../dataset/newyork/train/45720001.jpg ../dataset/newyork/train/45720001_wall.png ../dataset/newyork/train/45720001_close.png ../dataset/newyork/train/45720001_rooms.png ../dataset/newyork/train/45720001_close_wall.png
+ ../dataset/newyork/train/45720007.jpg ../dataset/newyork/train/45720007_wall.png ../dataset/newyork/train/45720007_close.png ../dataset/newyork/train/45720007_rooms.png ../dataset/newyork/train/45720007_close_wall.png
+ ../dataset/newyork/train/45724006.jpg ../dataset/newyork/train/45724006_wall.png ../dataset/newyork/train/45724006_close.png ../dataset/newyork/train/45724006_rooms.png ../dataset/newyork/train/45724006_close_wall.png
+ ../dataset/newyork/train/45724069.jpg ../dataset/newyork/train/45724069_wall.png ../dataset/newyork/train/45724069_close.png ../dataset/newyork/train/45724069_rooms.png ../dataset/newyork/train/45724069_close_wall.png
+ ../dataset/newyork/train/45724129.jpg ../dataset/newyork/train/45724129_wall.png ../dataset/newyork/train/45724129_close.png ../dataset/newyork/train/45724129_rooms.png ../dataset/newyork/train/45724129_close_wall.png
+ ../dataset/newyork/train/45724327.jpg ../dataset/newyork/train/45724327_wall.png ../dataset/newyork/train/45724327_close.png ../dataset/newyork/train/45724327_rooms.png ../dataset/newyork/train/45724327_close_wall.png
+ ../dataset/newyork/train/45724339.jpg ../dataset/newyork/train/45724339_wall.png ../dataset/newyork/train/45724339_close.png ../dataset/newyork/train/45724339_rooms.png ../dataset/newyork/train/45724339_close_wall.png
+ ../dataset/newyork/train/45724342.jpg ../dataset/newyork/train/45724342_wall.png ../dataset/newyork/train/45724342_close.png ../dataset/newyork/train/45724342_rooms.png ../dataset/newyork/train/45724342_close_wall.png
+ ../dataset/newyork/train/45724345.jpg ../dataset/newyork/train/45724345_wall.png ../dataset/newyork/train/45724345_close.png ../dataset/newyork/train/45724345_rooms.png ../dataset/newyork/train/45724345_close_wall.png
+ ../dataset/newyork/train/45724375.jpg ../dataset/newyork/train/45724375_wall.png ../dataset/newyork/train/45724375_close.png ../dataset/newyork/train/45724375_rooms.png ../dataset/newyork/train/45724375_close_wall.png
+ ../dataset/newyork/train/45724399.jpg ../dataset/newyork/train/45724399_wall.png ../dataset/newyork/train/45724399_close.png ../dataset/newyork/train/45724399_rooms.png ../dataset/newyork/train/45724399_close_wall.png
+ ../dataset/newyork/train/45724414.jpg ../dataset/newyork/train/45724414_wall.png ../dataset/newyork/train/45724414_close.png ../dataset/newyork/train/45724414_rooms.png ../dataset/newyork/train/45724414_close_wall.png
+ ../dataset/newyork/train/45724468.jpg ../dataset/newyork/train/45724468_wall.png ../dataset/newyork/train/45724468_close.png ../dataset/newyork/train/45724468_rooms.png ../dataset/newyork/train/45724468_close_wall.png
+ ../dataset/newyork/train/45725026.jpg ../dataset/newyork/train/45725026_wall.png ../dataset/newyork/train/45725026_close.png ../dataset/newyork/train/45725026_rooms.png ../dataset/newyork/train/45725026_close_wall.png
+ ../dataset/newyork/train/45725035.jpg ../dataset/newyork/train/45725035_wall.png ../dataset/newyork/train/45725035_close.png ../dataset/newyork/train/45725035_rooms.png ../dataset/newyork/train/45725035_close_wall.png
+ ../dataset/newyork/train/45728986.jpg ../dataset/newyork/train/45728986_wall.png ../dataset/newyork/train/45728986_close.png ../dataset/newyork/train/45728986_rooms.png ../dataset/newyork/train/45728986_close_wall.png
+ ../dataset/newyork/train/45731434.jpg ../dataset/newyork/train/45731434_wall.png ../dataset/newyork/train/45731434_close.png ../dataset/newyork/train/45731434_rooms.png ../dataset/newyork/train/45731434_close_wall.png
+ ../dataset/newyork/train/45737995.jpg ../dataset/newyork/train/45737995_wall.png ../dataset/newyork/train/45737995_close.png ../dataset/newyork/train/45737995_rooms.png ../dataset/newyork/train/45737995_close_wall.png
+ ../dataset/newyork/train/45740521.jpg ../dataset/newyork/train/45740521_wall.png ../dataset/newyork/train/45740521_close.png ../dataset/newyork/train/45740521_rooms.png ../dataset/newyork/train/45740521_close_wall.png
+ ../dataset/newyork/train/45740536.jpg ../dataset/newyork/train/45740536_wall.png ../dataset/newyork/train/45740536_close.png ../dataset/newyork/train/45740536_rooms.png ../dataset/newyork/train/45740536_close_wall.png
+ ../dataset/newyork/train/45740878.jpg ../dataset/newyork/train/45740878_wall.png ../dataset/newyork/train/45740878_close.png ../dataset/newyork/train/45740878_rooms.png ../dataset/newyork/train/45740878_close_wall.png
+ ../dataset/newyork/train/45740968.jpg ../dataset/newyork/train/45740968_wall.png ../dataset/newyork/train/45740968_close.png ../dataset/newyork/train/45740968_rooms.png ../dataset/newyork/train/45740968_close_wall.png
+ ../dataset/newyork/train/45741076.jpg ../dataset/newyork/train/45741076_wall.png ../dataset/newyork/train/45741076_close.png ../dataset/newyork/train/45741076_rooms.png ../dataset/newyork/train/45741076_close_wall.png
+ ../dataset/newyork/train/45741079.jpg ../dataset/newyork/train/45741079_wall.png ../dataset/newyork/train/45741079_close.png ../dataset/newyork/train/45741079_rooms.png ../dataset/newyork/train/45741079_close_wall.png
+ ../dataset/newyork/train/45741118.jpg ../dataset/newyork/train/45741118_wall.png ../dataset/newyork/train/45741118_close.png ../dataset/newyork/train/45741118_rooms.png ../dataset/newyork/train/45741118_close_wall.png
+ ../dataset/newyork/train/45741127.jpg ../dataset/newyork/train/45741127_wall.png ../dataset/newyork/train/45741127_close.png ../dataset/newyork/train/45741127_rooms.png ../dataset/newyork/train/45741127_close_wall.png
+ ../dataset/newyork/train/45743284.jpg ../dataset/newyork/train/45743284_wall.png ../dataset/newyork/train/45743284_close.png ../dataset/newyork/train/45743284_rooms.png ../dataset/newyork/train/45743284_close_wall.png
+ ../dataset/newyork/train/45756454.jpg ../dataset/newyork/train/45756454_wall.png ../dataset/newyork/train/45756454_close.png ../dataset/newyork/train/45756454_rooms.png ../dataset/newyork/train/45756454_close_wall.png
+ ../dataset/newyork/train/45763060.jpg ../dataset/newyork/train/45763060_wall.png ../dataset/newyork/train/45763060_close.png ../dataset/newyork/train/45763060_rooms.png ../dataset/newyork/train/45763060_close_wall.png
+ ../dataset/newyork/train/45764671.jpg ../dataset/newyork/train/45764671_wall.png ../dataset/newyork/train/45764671_close.png ../dataset/newyork/train/45764671_rooms.png ../dataset/newyork/train/45764671_close_wall.png
+ ../dataset/newyork/train/45764830.jpg ../dataset/newyork/train/45764830_wall.png ../dataset/newyork/train/45764830_close.png ../dataset/newyork/train/45764830_rooms.png ../dataset/newyork/train/45764830_close_wall.png
+ ../dataset/newyork/train/45765070.jpg ../dataset/newyork/train/45765070_wall.png ../dataset/newyork/train/45765070_close.png ../dataset/newyork/train/45765070_rooms.png ../dataset/newyork/train/45765070_close_wall.png
+ ../dataset/newyork/train/45765874.jpg ../dataset/newyork/train/45765874_wall.png ../dataset/newyork/train/45765874_close.png ../dataset/newyork/train/45765874_rooms.png ../dataset/newyork/train/45765874_close_wall.png
+ ../dataset/newyork/train/45774964.jpg ../dataset/newyork/train/45774964_wall.png ../dataset/newyork/train/45774964_close.png ../dataset/newyork/train/45774964_rooms.png ../dataset/newyork/train/45774964_close_wall.png
+ ../dataset/newyork/train/45775138.jpg ../dataset/newyork/train/45775138_wall.png ../dataset/newyork/train/45775138_close.png ../dataset/newyork/train/45775138_rooms.png ../dataset/newyork/train/45775138_close_wall.png
+ ../dataset/newyork/train/45775222.jpg ../dataset/newyork/train/45775222_wall.png ../dataset/newyork/train/45775222_close.png ../dataset/newyork/train/45775222_rooms.png ../dataset/newyork/train/45775222_close_wall.png
+ ../dataset/newyork/train/45775225.jpg ../dataset/newyork/train/45775225_wall.png ../dataset/newyork/train/45775225_close.png ../dataset/newyork/train/45775225_rooms.png ../dataset/newyork/train/45775225_close_wall.png
+ ../dataset/newyork/train/45775354.jpg ../dataset/newyork/train/45775354_wall.png ../dataset/newyork/train/45775354_close.png ../dataset/newyork/train/45775354_rooms.png ../dataset/newyork/train/45775354_close_wall.png
+ ../dataset/newyork/train/45775501.jpg ../dataset/newyork/train/45775501_wall.png ../dataset/newyork/train/45775501_close.png ../dataset/newyork/train/45775501_rooms.png ../dataset/newyork/train/45775501_close_wall.png
+ ../dataset/newyork/train/45775504.jpg ../dataset/newyork/train/45775504_wall.png ../dataset/newyork/train/45775504_close.png ../dataset/newyork/train/45775504_rooms.png ../dataset/newyork/train/45775504_close_wall.png
+ ../dataset/newyork/train/45775894.jpg ../dataset/newyork/train/45775894_wall.png ../dataset/newyork/train/45775894_close.png ../dataset/newyork/train/45775894_rooms.png ../dataset/newyork/train/45775894_close_wall.png
+ ../dataset/newyork/train/45777601.jpg ../dataset/newyork/train/45777601_wall.png ../dataset/newyork/train/45777601_close.png ../dataset/newyork/train/45777601_rooms.png ../dataset/newyork/train/45777601_close_wall.png
+ ../dataset/newyork/train/45777694.jpg ../dataset/newyork/train/45777694_wall.png ../dataset/newyork/train/45777694_close.png ../dataset/newyork/train/45777694_rooms.png ../dataset/newyork/train/45777694_close_wall.png
+ ../dataset/newyork/train/45780106.jpg ../dataset/newyork/train/45780106_wall.png ../dataset/newyork/train/45780106_close.png ../dataset/newyork/train/45780106_rooms.png ../dataset/newyork/train/45780106_close_wall.png
+ ../dataset/newyork/train/45780376.jpg ../dataset/newyork/train/45780376_wall.png ../dataset/newyork/train/45780376_close.png ../dataset/newyork/train/45780376_rooms.png ../dataset/newyork/train/45780376_close_wall.png
+ ../dataset/newyork/train/45781483.jpg ../dataset/newyork/train/45781483_wall.png ../dataset/newyork/train/45781483_close.png ../dataset/newyork/train/45781483_rooms.png ../dataset/newyork/train/45781483_close_wall.png
+ ../dataset/newyork/train/45783298.jpg ../dataset/newyork/train/45783298_wall.png ../dataset/newyork/train/45783298_close.png ../dataset/newyork/train/45783298_rooms.png ../dataset/newyork/train/45783298_close_wall.png
+ ../dataset/newyork/train/45783466.jpg ../dataset/newyork/train/45783466_wall.png ../dataset/newyork/train/45783466_close.png ../dataset/newyork/train/45783466_rooms.png ../dataset/newyork/train/45783466_close_wall.png
+ ../dataset/newyork/train/45.jpg ../dataset/newyork/train/45_wall.png ../dataset/newyork/train/45_close.png ../dataset/newyork/train/45_rooms.png ../dataset/newyork/train/45_close_wall.png
+ ../dataset/newyork/train/46452431.jpg ../dataset/newyork/train/46452431_wall.png ../dataset/newyork/train/46452431_close.png ../dataset/newyork/train/46452431_rooms.png ../dataset/newyork/train/46452431_close_wall.png
+ ../dataset/newyork/train/46678955.jpg ../dataset/newyork/train/46678955_wall.png ../dataset/newyork/train/46678955_close.png ../dataset/newyork/train/46678955_rooms.png ../dataset/newyork/train/46678955_close_wall.png
+ ../dataset/newyork/train/46781618.jpg ../dataset/newyork/train/46781618_wall.png ../dataset/newyork/train/46781618_close.png ../dataset/newyork/train/46781618_rooms.png ../dataset/newyork/train/46781618_close_wall.png
+ ../dataset/newyork/train/46807061.jpg ../dataset/newyork/train/46807061_wall.png ../dataset/newyork/train/46807061_close.png ../dataset/newyork/train/46807061_rooms.png ../dataset/newyork/train/46807061_close_wall.png
+ ../dataset/newyork/train/46.jpg ../dataset/newyork/train/46_wall.png ../dataset/newyork/train/46_close.png ../dataset/newyork/train/46_rooms.png ../dataset/newyork/train/46_close_wall.png
+ ../dataset/newyork/train/47073524.jpg ../dataset/newyork/train/47073524_wall.png ../dataset/newyork/train/47073524_close.png ../dataset/newyork/train/47073524_rooms.png ../dataset/newyork/train/47073524_close_wall.png
+ ../dataset/newyork/train/47185109.jpg ../dataset/newyork/train/47185109_wall.png ../dataset/newyork/train/47185109_close.png ../dataset/newyork/train/47185109_rooms.png ../dataset/newyork/train/47185109_close_wall.png
+ ../dataset/newyork/train/47236967.jpg ../dataset/newyork/train/47236967_wall.png ../dataset/newyork/train/47236967_close.png ../dataset/newyork/train/47236967_rooms.png ../dataset/newyork/train/47236967_close_wall.png
+ ../dataset/newyork/train/47325578.jpg ../dataset/newyork/train/47325578_wall.png ../dataset/newyork/train/47325578_close.png ../dataset/newyork/train/47325578_rooms.png ../dataset/newyork/train/47325578_close_wall.png
+ ../dataset/newyork/train/47360870.jpg ../dataset/newyork/train/47360870_wall.png ../dataset/newyork/train/47360870_close.png ../dataset/newyork/train/47360870_rooms.png ../dataset/newyork/train/47360870_close_wall.png
+ ../dataset/newyork/train/47429369.jpg ../dataset/newyork/train/47429369_wall.png ../dataset/newyork/train/47429369_close.png ../dataset/newyork/train/47429369_rooms.png ../dataset/newyork/train/47429369_close_wall.png
+ ../dataset/newyork/train/47464136.jpg ../dataset/newyork/train/47464136_wall.png ../dataset/newyork/train/47464136_close.png ../dataset/newyork/train/47464136_rooms.png ../dataset/newyork/train/47464136_close_wall.png
+ ../dataset/newyork/train/47464142.jpg ../dataset/newyork/train/47464142_wall.png ../dataset/newyork/train/47464142_close.png ../dataset/newyork/train/47464142_rooms.png ../dataset/newyork/train/47464142_close_wall.png
+ ../dataset/newyork/train/47464151.jpg ../dataset/newyork/train/47464151_wall.png ../dataset/newyork/train/47464151_close.png ../dataset/newyork/train/47464151_rooms.png ../dataset/newyork/train/47464151_close_wall.png
+ ../dataset/newyork/train/47465963.jpg ../dataset/newyork/train/47465963_wall.png ../dataset/newyork/train/47465963_close.png ../dataset/newyork/train/47465963_rooms.png ../dataset/newyork/train/47465963_close_wall.png
+ ../dataset/newyork/train/47484836.jpg ../dataset/newyork/train/47484836_wall.png ../dataset/newyork/train/47484836_close.png ../dataset/newyork/train/47484836_rooms.png ../dataset/newyork/train/47484836_close_wall.png
+ ../dataset/newyork/train/47489621.jpg ../dataset/newyork/train/47489621_wall.png ../dataset/newyork/train/47489621_close.png ../dataset/newyork/train/47489621_rooms.png ../dataset/newyork/train/47489621_close_wall.png
+ ../dataset/newyork/train/47489648.jpg ../dataset/newyork/train/47489648_wall.png ../dataset/newyork/train/47489648_close.png ../dataset/newyork/train/47489648_rooms.png ../dataset/newyork/train/47489648_close_wall.png
+ ../dataset/newyork/train/47490062.jpg ../dataset/newyork/train/47490062_wall.png ../dataset/newyork/train/47490062_close.png ../dataset/newyork/train/47490062_rooms.png ../dataset/newyork/train/47490062_close_wall.png
+ ../dataset/newyork/train/47492936.jpg ../dataset/newyork/train/47492936_wall.png ../dataset/newyork/train/47492936_close.png ../dataset/newyork/train/47492936_rooms.png ../dataset/newyork/train/47492936_close_wall.png
+ ../dataset/newyork/train/47499269.jpg ../dataset/newyork/train/47499269_wall.png ../dataset/newyork/train/47499269_close.png ../dataset/newyork/train/47499269_rooms.png ../dataset/newyork/train/47499269_close_wall.png
+ ../dataset/newyork/train/47499620.jpg ../dataset/newyork/train/47499620_wall.png ../dataset/newyork/train/47499620_close.png ../dataset/newyork/train/47499620_rooms.png ../dataset/newyork/train/47499620_close_wall.png
+ ../dataset/newyork/train/47503913.jpg ../dataset/newyork/train/47503913_wall.png ../dataset/newyork/train/47503913_close.png ../dataset/newyork/train/47503913_rooms.png ../dataset/newyork/train/47503913_close_wall.png
+ ../dataset/newyork/train/47505359.jpg ../dataset/newyork/train/47505359_wall.png ../dataset/newyork/train/47505359_close.png ../dataset/newyork/train/47505359_rooms.png ../dataset/newyork/train/47505359_close_wall.png
+ ../dataset/newyork/train/47508827.jpg ../dataset/newyork/train/47508827_wall.png ../dataset/newyork/train/47508827_close.png ../dataset/newyork/train/47508827_rooms.png ../dataset/newyork/train/47508827_close_wall.png
+ ../dataset/newyork/train/47514899.jpg ../dataset/newyork/train/47514899_wall.png ../dataset/newyork/train/47514899_close.png ../dataset/newyork/train/47514899_rooms.png ../dataset/newyork/train/47514899_close_wall.png
+ ../dataset/newyork/train/47514920.jpg ../dataset/newyork/train/47514920_wall.png ../dataset/newyork/train/47514920_close.png ../dataset/newyork/train/47514920_rooms.png ../dataset/newyork/train/47514920_close_wall.png
+ ../dataset/newyork/train/47534687.jpg ../dataset/newyork/train/47534687_wall.png ../dataset/newyork/train/47534687_close.png ../dataset/newyork/train/47534687_rooms.png ../dataset/newyork/train/47534687_close_wall.png
+ ../dataset/newyork/train/4.jpg ../dataset/newyork/train/4_wall.png ../dataset/newyork/train/4_close.png ../dataset/newyork/train/4_rooms.png ../dataset/newyork/train/4_close_wall.png
+ ../dataset/newyork/train/50.jpg ../dataset/newyork/train/50_wall.png ../dataset/newyork/train/50_close.png ../dataset/newyork/train/50_rooms.png ../dataset/newyork/train/50_close_wall.png
+ ../dataset/newyork/train/52.jpg ../dataset/newyork/train/52_wall.png ../dataset/newyork/train/52_close.png ../dataset/newyork/train/52_rooms.png ../dataset/newyork/train/52_close_wall.png
+ ../dataset/newyork/train/57.jpg ../dataset/newyork/train/57_wall.png ../dataset/newyork/train/57_close.png ../dataset/newyork/train/57_rooms.png ../dataset/newyork/train/57_close_wall.png
+ ../dataset/newyork/train/7.jpg ../dataset/newyork/train/7_wall.png ../dataset/newyork/train/7_close.png ../dataset/newyork/train/7_rooms.png ../dataset/newyork/train/7_close_wall.png
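Each manifest line above lists five paths per sample: the floorplan image plus its wall, opening ("close"), room, and combined boundary masks. A minimal, hypothetical sketch of parsing one such line (the tab separator follows main.py's `p.split('\t')[0]`, with plain whitespace as a fallback; the helper name is made up):

```python
def parse_manifest_line(line):
    # Split one manifest row into named annotation paths.
    # Assumes tab-separated columns (as in main.py), falling back to whitespace.
    parts = line.strip().split("\t")
    if len(parts) < 5:
        parts = line.split()
    keys = ["image", "wall", "close", "rooms", "close_wall"]
    return dict(zip(keys, parts))

line = ("../dataset/newyork/train/45765070.jpg "
        "../dataset/newyork/train/45765070_wall.png "
        "../dataset/newyork/train/45765070_close.png "
        "../dataset/newyork/train/45765070_rooms.png "
        "../dataset/newyork/train/45765070_close_wall.png")
record = parse_manifest_line(line)
print(record["rooms"])  # ../dataset/newyork/train/45765070_rooms.png
```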
deepfloorplan_inference.py ADDED
@@ -0,0 +1,55 @@
+ import os
+ import numpy as np
+ import tensorflow as tf
+ from PIL import Image
+ import imageio
+ from net import Network
+ from utils.rgb_ind_convertor import ind2rgb, floorplan_fuse_map
+
+ class DeepFloorPlanModel:
+     def __init__(self, model_dir='pretrained', input_size=(512, 512)):
+         self.input_size = input_size
+         self.model_dir = model_dir
+         self._build_graph()
+         self._load_weights()
+
+     def _build_graph(self):
+         tf.compat.v1.reset_default_graph()
+         self.sess = tf.compat.v1.Session()
+         self.x = tf.compat.v1.placeholder(shape=[1, self.input_size[0], self.input_size[1], 3], dtype=tf.float32, name='inputs')
+         self.network = Network()
+         logits1, logits2 = self.network.forward(self.x, init_with_pretrain_vgg=False)
+         self.rooms = self.network.convert_one_hot_to_image(logits1, act='softmax', dtype='int')
+         self.close_walls = self.network.convert_one_hot_to_image(logits2, act='softmax', dtype='int')
+         self.sess.run(tf.compat.v1.global_variables_initializer())
+         self.sess.run(tf.compat.v1.local_variables_initializer())
+         self.saver = tf.compat.v1.train.Saver()
+
+     def _load_weights(self):
+         ckpt = tf.train.latest_checkpoint(self.model_dir)
+         if ckpt is None:
+             raise FileNotFoundError(f"No checkpoint found in {self.model_dir}")
+         self.saver.restore(self.sess, ckpt)
+
+     def predict(self, image):
+         # Accepts a numpy array or PIL image; returns an RGB numpy array (segmentation mask).
+         if isinstance(image, Image.Image):
+             image = np.array(image)
+         if image.shape[-1] == 4:
+             image = image[..., :3]
+         im_resized = np.array(Image.fromarray(image).resize(self.input_size, Image.BICUBIC)) / 255.0
+         im_resized = im_resized.astype(np.float32)
+         im_resized = np.reshape(im_resized, (1, self.input_size[0], self.input_size[1], 3))
+         out1, out2 = self.sess.run([self.rooms, self.close_walls], feed_dict={self.x: im_resized})
+         out1 = np.squeeze(out1)
+         out2 = np.squeeze(out2)
+         # Merge logic: overwrite room labels where boundaries are predicted (9 = door/window, 10 = wall).
+         out1[out2 == 2] = 10
+         out1[out2 == 1] = 9
+         # Convert indexed labels to RGB for visualization.
+         out_rgb = ind2rgb(out1, color_map=floorplan_fuse_map)
+         out_rgb = out_rgb.astype(np.uint8)
+         return out_rgb
+
+     def close(self):
+         self.sess.close()
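The label-merge step in predict() above overlays the boundary prediction on the room prediction. A self-contained numpy sketch of that rule (class ids 9/10 follow the fuse map; the 3x3 toy arrays are made up for illustration):

```python
import numpy as np

# Toy room-type map (e.g. 3 = living room, 4 = bedroom) and boundary map
# (1 = door/window, 2 = wall); values chosen for illustration only.
rooms = np.array([[3, 3, 4],
                  [3, 0, 4],
                  [3, 3, 4]], dtype=np.uint8)
boundary = np.array([[0, 2, 0],
                     [1, 0, 0],
                     [0, 2, 0]], dtype=np.uint8)

merged = rooms.copy()
merged[boundary == 2] = 10  # walls overwrite room labels
merged[boundary == 1] = 9   # doors/windows overwrite room labels
print(merged)
```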
demo.py ADDED
@@ -0,0 +1,89 @@
+ import os
+ import argparse
+ import numpy as np
+ import tensorflow as tf
+
+ import imageio
+ from PIL import Image
+
+ from matplotlib import pyplot as plt
+
+ os.environ['CUDA_VISIBLE_DEVICES'] = '0'
+ os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
+
+ # input image path
+ parser = argparse.ArgumentParser()
+ parser.add_argument('--im_path', type=str, default='./demo/45765448.jpg',
+                     help='input image path.')
+
+ # color map
+ floorplan_map = {
+     0: [255, 255, 255],  # background
+     1: [192, 192, 224],  # closet
+     2: [192, 255, 255],  # bathroom/washroom
+     3: [224, 255, 192],  # living room/kitchen/dining room
+     4: [255, 224, 128],  # bedroom
+     5: [255, 160,  96],  # hall
+     6: [255, 224, 224],  # balcony
+     7: [255, 255, 255],  # not used
+     8: [255, 255, 255],  # not used
+     9: [255,  60, 128],  # door & window
+     10: [ 0,   0,   0]   # wall
+ }
+
+ def ind2rgb(ind_im, color_map=floorplan_map):
+     rgb_im = np.zeros((ind_im.shape[0], ind_im.shape[1], 3))
+
+     for i, rgb in color_map.items():
+         rgb_im[(ind_im == i)] = rgb
+
+     return rgb_im
+
+ def main(args):
+     # load input, resize to 512x512 and normalize to [0, 1]
+     im = imageio.imread(args.im_path, pilmode='RGB')
+     im = np.array(Image.fromarray(im).resize((512, 512))) / 255.
+     im = im.astype(np.float32)
+
+     # create tensorflow session
+     with tf.Session() as sess:
+
+         # initialize
+         sess.run(tf.group(tf.global_variables_initializer(),
+                           tf.local_variables_initializer()))
+
+         # restore pretrained model
+         saver = tf.train.import_meta_graph('./pretrained/pretrained_r3d.meta')
+         saver.restore(sess, './pretrained/pretrained_r3d')
+
+         # get default graph
+         graph = tf.get_default_graph()
+
+         # restore input & output tensors
+         x = graph.get_tensor_by_name('inputs:0')
+         room_type_logit = graph.get_tensor_by_name('Cast:0')
+         room_boundary_logit = graph.get_tensor_by_name('Cast_1:0')
+
+         # infer results
+         [room_type, room_boundary] = sess.run([room_type_logit, room_boundary_logit],
+                                               feed_dict={x: im.reshape(1, 512, 512, 3)})
+         room_type, room_boundary = np.squeeze(room_type), np.squeeze(room_boundary)
+
+         # merge results (9 = door/window, 10 = wall)
+         floorplan = room_type.copy()
+         floorplan[room_boundary == 1] = 9
+         floorplan[room_boundary == 2] = 10
+         floorplan_rgb = ind2rgb(floorplan)
+
+         # plot results
+         plt.subplot(121)
+         plt.imshow(im)
+         plt.subplot(122)
+         plt.imshow(floorplan_rgb / 255.)
+         plt.show()
+
+ if __name__ == '__main__':
+     FLAGS, unparsed = parser.parse_known_args()
+     main(FLAGS)
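demo.py's ind2rgb above colorizes the class-index image one boolean mask per class, broadcasting each RGB triple over the matching pixels. A small self-contained check of that construction (toy 1x3 index image; the three colors are taken from the floorplan_map in demo.py):

```python
import numpy as np

# Subset of demo.py's floorplan_map: background, door/window, wall.
color_map = {0: [255, 255, 255], 9: [255, 60, 128], 10: [0, 0, 0]}

def ind2rgb(ind_im, color_map=color_map):
    # Same construction as demo.py: one boolean mask per class,
    # with the RGB triple broadcast onto the selected pixels.
    rgb_im = np.zeros((ind_im.shape[0], ind_im.shape[1], 3))
    for i, rgb in color_map.items():
        rgb_im[ind_im == i] = rgb
    return rgb_im

ind = np.array([[0, 9, 10]])  # one background, one door/window, one wall pixel
rgb = ind2rgb(ind)
print(rgb.shape)  # (1, 3, 3)
```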
demo/45719584.jpg ADDED
demo/45765448.jpg ADDED
demo/47541863.jpg ADDED
main.py ADDED
@@ -0,0 +1,317 @@
+ import argparse
+ from net import *
+ import os
+ import time
+ import random
+ import numpy as np
+ import tensorflow as tf
+ import imageio
+ from PIL import Image
+
+ os.environ['CUDA_VISIBLE_DEVICES'] = GPU_ID  # GPU_ID is expected to come from `from net import *`
+ os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
+
+ seed = 8964
+
+ # command-line arguments
+ parser = argparse.ArgumentParser()
+ parser.add_argument('--phase', type=str, default='Test',
+                     help='Train/Test network.')
+
+ class MODEL(Network):
+     """docstring for MODEL"""
+     def __init__(self):
+         Network.__init__(self)
+         self.log_dir = 'pretrained'
+         self.eval_file = './dataset/r3d_test.txt'
+         self.loss_type = 'balanced'
+
+     def convert_one_hot_to_image(self, one_hot, dtype='float', act=None):
+         if act == 'softmax':
+             one_hot = tf.nn.softmax(one_hot, axis=-1)
+
+         [n, h, w, c] = one_hot.shape.as_list()
+
+         im = tf.reshape(tf.argmax(one_hot, axis=-1), [n, h, w, 1])
+         if dtype == 'int':
+             im = tf.cast(im, dtype=tf.uint8)
+         else:
+             im = tf.cast(im, dtype=tf.float32)
+         return im
+
+     def cross_two_tasks_weight(self, y1, y2):
+         p1 = tf.reduce_sum(y1)
+         p2 = tf.reduce_sum(y2)
+
+         w1 = p2 / (p1 + p2)
+         w2 = p1 / (p1 + p2)
+
+         return w1, w2
+
+     def balanced_entropy(self, x, y):
+         # clip by eps
+         eps = 1e-6
+         z = tf.nn.softmax(x)
+         cliped_z = tf.clip_by_value(z, eps, 1 - eps)
+         log_z = tf.log(cliped_z)
+
+         num_classes = y.shape.as_list()[-1]
+         ind = tf.argmax(y, -1, output_type=tf.int32)
+         # ind = tf.reshape(ind, shape=[1, 512, 512, 1])  # for debugging
+
+         total = tf.reduce_sum(y)  # total foreground pixels
+
+         m_c = []  # index mask per class
+         n_c = []  # foreground pixel count per class
+         for c in range(num_classes):
+             m_c.append(tf.cast(tf.equal(ind, c), dtype=tf.int32))
+             n_c.append(tf.cast(tf.reduce_sum(m_c[-1]), dtype=tf.float32))
+
+         # compute counts
+         c = []
+         for i in range(num_classes):
+             c.append(total - n_c[i])
+         tc = tf.add_n(c)
+
+         # used to compute the loss
+         loss = 0.
+         for i in range(num_classes):
+             w = c[i] / tc
+             m_c_one_hot = tf.one_hot((i * m_c[i]), num_classes, axis=-1)
+             y_c = m_c_one_hot * y
+
+             loss += w * tf.reduce_mean(-tf.reduce_sum(y_c * log_z, axis=1))
+
+         return (loss / num_classes)  # mean
+
+     def train(self, loader_dict, num_batch, max_step=40000):
+         images = loader_dict['images']
+         labels_r_hot = loader_dict['label_rooms']
+         labels_cw_hot = loader_dict['label_boundaries']
+
+         max_ep = max_step // num_batch
+         print('max_step = {}, max_ep = {}, num_batch = {}'.format(max_step, max_ep, num_batch))
+
+         logits1, logits2 = self.forward(images, init_with_pretrain_vgg=False)
+
+         if self.loss_type == 'balanced':
+             # in-task loss balance
+             loss1 = self.balanced_entropy(logits1, labels_r_hot)  # multi-class balance
+             loss2 = self.balanced_entropy(logits2, labels_cw_hot)
+         else:
+             loss1 = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(logits=logits1, labels=labels_r_hot, name='bce1'))
+             loss2 = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(logits=logits2, labels=labels_cw_hot, name='bce2'))
+
+         # compute cross-task loss balance weights
+         w1, w2 = self.cross_two_tasks_weight(labels_r_hot, labels_cw_hot)
+         loss = (w1 * loss1 + w2 * loss2)
+
+         optim = tf.train.AdamOptimizer(learning_rate=1e-4).minimize(loss, colocate_gradients_with_ops=True)  # assign gradient ops to the same device as the forward ops
+
+         # # add image summary
+         # tf.summary.image('input', images)
+         # tf.summary.image('label_r', self.convert_one_hot_to_image(labels_r_hot))
+         # tf.summary.image('predict_room', self.convert_one_hot_to_image(logits1, act='softmax'))  # room type uses argmax to visualize
+         # tf.summary.image('predict_close_wall', tf.nn.sigmoid(logits2))  # boundary type uses argmax to visualize
+
+         # # add scalar summary
+         # tf.summary.scalar('bce', loss)
+
+         # define session
+         config = tf.ConfigProto(allow_soft_placement=True)
+         config.gpu_options.allow_growth = True  # prevent the program from occupying all GPU memory
+         with tf.Session(config=config) as sess:
+             # init all variables in graph
+             sess.run(tf.group(tf.global_variables_initializer(),
+                               tf.local_variables_initializer()))
+
+             # saver
+             saver = tf.train.Saver(max_to_keep=10)
+
+             # filewriter for log info
+             # log_dir = self.log_dir+'/run-%02d%02d-%02d%02d' % tuple(time.localtime(time.time()))[1:5]
+             # writer = tf.summary.FileWriter(log_dir)
+             # merged = tf.summary.merge_all()
+
+             # coordinator for queue runner
+             coord = tf.train.Coordinator()
+
+             # start queue
+             threads = tf.train.start_queue_runners(sess=sess, coord=coord)
+
+             print("Start Training!")
+             total_times = 0
+
+             for ep in range(max_ep):  # epoch loop
+                 for n in range(num_batch):  # batch loop
+                     tic = time.time()
+                     # [loss_value, update_value, summaries] = sess.run([loss, optim, merged])
+                     [loss_value, update_value] = sess.run([loss, optim])
+                     duration = time.time() - tic
+
+                     total_times += duration
+
+                     step = int(ep * num_batch + n)
+                     # write log
+                     print('step {}: loss = {:.3}; {:.2} data/sec, executed {} minutes'.format(step,
+                           loss_value, 1.0 / duration, int(total_times / 60)))
+                     # writer.add_summary(summaries, global_step=step)
+                 # save model parameters every 2 epochs
+                 if ep % 2 == 0:
+                     saver.save(sess, self.log_dir + '/model', global_step=ep)
+                     self.evaluate(sess=sess, epoch=ep)
+             saver.save(sess, self.log_dir + '/model', global_step=max_ep)
+             self.evaluate(sess=sess, epoch=max_ep)
+
+             # close session
+             coord.request_stop()
+             coord.join(threads)
+             sess.close()
+
+     def infer(self, save_dir='out', resize=True, merge=True):
+         print("generating test set of {}.... will save to [./{}]".format(self.eval_file, save_dir))
+         room_dir = os.path.join(save_dir, 'room')
+         close_wall_dir = os.path.join(save_dir, 'boundary')
+
+         if not os.path.exists(save_dir):
+             os.mkdir(save_dir)
+         if not os.path.exists(room_dir):
+             os.mkdir(room_dir)
+         if not os.path.exists(close_wall_dir):
+             os.mkdir(close_wall_dir)
+
+         x = tf.placeholder(shape=[1, 512, 512, 3], dtype=tf.float32)
+
+         logits1, logits2 = self.forward(x, init_with_pretrain_vgg=False)
+         rooms = self.convert_one_hot_to_image(logits1, act='softmax', dtype='int')
+         close_walls = self.convert_one_hot_to_image(logits2, act='softmax', dtype='int')
+
+         config = tf.ConfigProto(allow_soft_placement=True)
+         sess = tf.Session(config=config)
+         sess.run(tf.group(tf.global_variables_initializer(),
+                           tf.local_variables_initializer()))
+
+         saver = tf.train.Saver()  # restore all parameters
+         saver.restore(sess, save_path=tf.train.latest_checkpoint(self.log_dir))
+
+         # infer one by one
+         paths = open(self.eval_file, 'r').read().splitlines()
+         paths = [p.split('\t')[0] for p in paths]
+         for p in paths:
+             im = imageio.imread(p, pilmode='RGB')
+             im_x = np.array(Image.fromarray(im).resize((512, 512))) / 255.  # resize and normalize
+             im_x = np.reshape(im_x, (1, 512, 512, 3))
+
+             [out1, out2] = sess.run([rooms, close_walls], feed_dict={x: im_x})
208
+ if resize:
209
+ # out1 = imresize(np.squeeze(out1), (im.shape[0], im.shape[1])) # resize back
210
+ # out2 = imresize(np.squeeze(out2), (im.shape[0], im.shape[1])) # resize back
211
+ out1_rgb = ind2rgb(np.squeeze(out1))
212
+ out1_rgb = imageio.imresize(out1_rgb, (im.shape[0], im.shape[1])) # resize back
213
+ out2_rgb = ind2rgb(np.squeeze(out2), color_map=floorplan_boundary_map)
214
+ out2_rgb = imageio.imresize(out2_rgb, (im.shape[0], im.shape[1])) # resize back
215
+ else:
216
+ out1_rgb = ind2rgb(np.squeeze(out1))
217
+ out2_rgb = ind2rgb(np.squeeze(out2), color_map=floorplan_boundary_map)
218
+
219
+ if merge:
220
+ out1 = np.squeeze(out1)
221
+ out2 = np.squeeze(out2)
222
+ out1[out2==2] = 10
223
+ out1[out2==1] = 9
224
+ # out3_rgb = ind2rgb(out1, color_map=floorplan_fuse_map_figure) # use for present
225
+ out3_rgb = ind2rgb(out1, color_map=floorplan_fuse_map) # use for present
226
+
227
+ name = p.split('/')[-1]
228
+ save_path1 = os.path.join(room_dir, name.split('.jpg')[0]+'_rooms.png')
229
+ save_path2 = os.path.join(close_wall_dir, name.split('.jpg')[0]+'_bd_rm.png')
230
+ save_path3 = os.path.join(save_dir, name.split('.jpg')[0]+'_rooms.png')
231
+
232
+ imageio.imwrite(save_path1, out1_rgb)
233
+ imageio.imwrite(save_path2, out2_rgb)
234
+ if merge:
235
+ imageio.imwrite(save_path3, out3_rgb)
236
+ # imsave(save_path4, out4)
237
+
238
+ print('Saving prediction: {}'.format(name))
239
+
240
+ def evaluate(self, sess, epoch, num_of_classes=11):
241
+ x = tf.placeholder(shape=[1, 512, 512, 3], dtype=tf.float32)
242
+ logits1, logits2 = self.forward(x, init_with_pretrain_vgg=False)
243
+ predict_bd = self.convert_one_hot_to_image(logits2, act='softmax', dtype='int')
244
+ predict_room = self.convert_one_hot_to_image(logits1, act='softmax', dtype='int')
245
+
246
+ paths = open(self.eval_file, 'r').read().splitlines()
247
+ image_paths = [p.split('\t')[0] for p in paths] # image
248
+ gt2_paths = [p.split('\t')[2] for p in paths] # 2 denote doors (and windows)
249
+ gt3_paths = [p.split('\t')[3] for p in paths] # 3 denote rooms
250
+ gt4_paths = [p.split('\t')[-1] for p in paths] # last one denote close wall
251
+
252
+ n = len(paths)
253
+
254
+ hist = np.zeros((num_of_classes, num_of_classes))
255
+ for i in range(n):
256
+ im = imageio.imread(image_paths[i], mode='RGB')
257
+ # for fuse label
258
+ dd = imageio.imread(gt2_paths[i], mode='L')
259
+ rr = imageio.imread(gt3_paths[i], mode='RGB')
260
+ cw = imageio.imread(gt4_paths[i], mode='L')
261
+
262
+ im = imageio.imresize(im, (512, 512, 3)) / 255. # normalize input image
263
+ im = np.reshape(im, (1,512,512,3))
264
+ # merge label
265
+ rr = imageio.imresize(rr, (512, 512, 3))
266
+ rr_ind = rgb2ind(rr)
267
+ cw = imageio.imresize(cw, (512, 512)) / 255
268
+ dd = imageio.imresize(dd, (512, 512)) / 255
269
+ cw = (cw>0.5).astype(np.uint8)
270
+ dd = (dd>0.5).astype(np.uint8)
271
+ rr_ind[cw==1] = 10
272
+ rr_ind[dd==1] = 9
273
+
274
+ # merge prediciton
275
+ rm_ind, bd_ind = sess.run([predict_room, predict_bd], feed_dict={x: im})
276
+ rm_ind = np.squeeze(rm_ind)
277
+ bd_ind = np.squeeze(bd_ind)
278
+ rm_ind[bd_ind==2] = 10
279
+ rm_ind[bd_ind==1] = 9
280
+
281
+ hist += fast_hist(rm_ind.flatten(), rr_ind.flatten(), num_of_classes)
282
+
283
+ overall_acc = np.diag(hist).sum() / hist.sum()
284
+ mean_acc = np.diag(hist) / (hist.sum(1) + 1e-6)
285
+ # iu = np.diag(hist) / (hist.sum(1) + 1e-6 + hist.sum(0) - np.diag(hist))
286
+ mean_acc9 = (np.nansum(mean_acc[:7])+mean_acc[-2]+mean_acc[-1]) / 9.
287
+
288
+ file = open('EVAL_'+self.log_dir, 'a')
289
+ print('Model at epoch {}: overall accuracy = {:.4}, mean_acc = {:.4}'.format(epoch, overall_acc, mean_acc9))
290
+ for i in range(mean_acc.shape[0]):
291
+ if i not in [7 ,8]: # ingore class 7 & 8
292
+ print('\t\tepoch {}: {}th label: accuracy = {:.4}'.format(epoch, i, mean_acc[i]))
293
+ file.close()
294
+
295
+ def main(args):
296
+ tf.set_random_seed(seed)
297
+ np.random.seed(seed)
298
+ random.seed(seed)
299
+
300
+ model = MODEL()
301
+
302
+ if args.phase.lower() == 'train':
303
+ loader_dict, num_batch = data_loader_bd_rm_from_tfrecord(batch_size=1)
304
+
305
+ # START TRAINING
306
+ tic = time.time()
307
+ model.train(loader_dict, num_batch)
308
+ toc = time.time()
309
+ print('total training + evaluation time = {} minutes'.format((toc-tic)/60))
310
+ elif args.phase.lower() == 'test':
311
+ model.infer()
312
+ else:
313
+ pass
314
+
315
+ if __name__ == '__main__':
316
+ FLAGS, unparsed = parser.parse_known_args()
317
+ main(FLAGS)
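The `evaluate` method above accumulates a confusion matrix with `fast_hist` (imported from `utils`, whose implementation is not shown in this diff). A plausible minimal version, assuming the common single-`bincount` formulation with predictions as the first argument, matching the call `fast_hist(rm_ind.flatten(), rr_ind.flatten(), num_of_classes)`:

```python
import numpy as np

def fast_hist(pred, gt, n):
    # confusion matrix via one bincount: entry [i, j] counts pixels
    # whose ground-truth class is i and predicted class is j
    k = (gt >= 0) & (gt < n)
    return np.bincount(n * gt[k].astype(int) + pred[k].astype(int),
                       minlength=n ** 2).reshape(n, n)

hist = fast_hist(np.array([0, 1, 1]), np.array([0, 1, 0]), 2)
# overall accuracy = trace / total, exactly as computed in evaluate()
acc = np.diag(hist).sum() / hist.sum()
```

Because the matrix is accumulated additively over images, per-class accuracy and IoU can be derived once at the end, as the `evaluate` method does.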
net.py ADDED
@@ -0,0 +1,362 @@
+ import numpy as np
+ import tensorflow as tf  # TF 1.x API (requirements.txt pins tensorflow==1.15.5)
+
+ from tensorflow.contrib.slim.nets import vgg
+
+ import os
+ import sys
+ import glob
+ import time
+ import random
+
+ from scipy import ndimage
+ import imageio
+ from PIL import Image
+
+ sys.path.append('./utils/')
+ from rgb_ind_convertor import *
+ from util import fast_hist
+ from tf_record import read_record, read_bd_rm_record
+
+ GPU_ID = '0'
+
+ def data_loader_bd_rm_from_tfrecord(batch_size=1):
+     paths = open('../dataset/r3d_train.txt', 'r').read().splitlines()
+
+     loader_dict = read_bd_rm_record('../dataset/r3d.tfrecords', batch_size=batch_size, size=512)
+
+     num_batch = len(paths) // batch_size
+
+     return loader_dict, num_batch
+
+ class Network(object):
+     """Shared VGG16 encoder with two task-specific decoders (boundary & room)."""
+     def __init__(self, dtype=tf.float32):
+         print('Initial nn network object...')
+         self.dtype = dtype
+         # {'checkpoint_scope_var_name': 'current_scope_var_name'}; shapes must match
+         self.pre_train_restore_map = {'vgg_16/conv1/conv1_1/weights': 'FNet/conv1_1/W',
+                                       'vgg_16/conv1/conv1_1/biases': 'FNet/conv1_1/b',
+                                       'vgg_16/conv1/conv1_2/weights': 'FNet/conv1_2/W',
+                                       'vgg_16/conv1/conv1_2/biases': 'FNet/conv1_2/b',
+                                       'vgg_16/conv2/conv2_1/weights': 'FNet/conv2_1/W',
+                                       'vgg_16/conv2/conv2_1/biases': 'FNet/conv2_1/b',
+                                       'vgg_16/conv2/conv2_2/weights': 'FNet/conv2_2/W',
+                                       'vgg_16/conv2/conv2_2/biases': 'FNet/conv2_2/b',
+                                       'vgg_16/conv3/conv3_1/weights': 'FNet/conv3_1/W',
+                                       'vgg_16/conv3/conv3_1/biases': 'FNet/conv3_1/b',
+                                       'vgg_16/conv3/conv3_2/weights': 'FNet/conv3_2/W',
+                                       'vgg_16/conv3/conv3_2/biases': 'FNet/conv3_2/b',
+                                       'vgg_16/conv3/conv3_3/weights': 'FNet/conv3_3/W',
+                                       'vgg_16/conv3/conv3_3/biases': 'FNet/conv3_3/b',
+                                       'vgg_16/conv4/conv4_1/weights': 'FNet/conv4_1/W',
+                                       'vgg_16/conv4/conv4_1/biases': 'FNet/conv4_1/b',
+                                       'vgg_16/conv4/conv4_2/weights': 'FNet/conv4_2/W',
+                                       'vgg_16/conv4/conv4_2/biases': 'FNet/conv4_2/b',
+                                       'vgg_16/conv4/conv4_3/weights': 'FNet/conv4_3/W',
+                                       'vgg_16/conv4/conv4_3/biases': 'FNet/conv4_3/b',
+                                       'vgg_16/conv5/conv5_1/weights': 'FNet/conv5_1/W',
+                                       'vgg_16/conv5/conv5_1/biases': 'FNet/conv5_1/b',
+                                       'vgg_16/conv5/conv5_2/weights': 'FNet/conv5_2/W',
+                                       'vgg_16/conv5/conv5_2/biases': 'FNet/conv5_2/b',
+                                       'vgg_16/conv5/conv5_3/weights': 'FNet/conv5_3/W',
+                                       'vgg_16/conv5/conv5_3/biases': 'FNet/conv5_3/b'}
+
+     # basic layers
+     def _he_uniform(self, shape, regularizer=None, trainable=None, name=None):
+         name = 'W' if name is None else name+'/W'
+
+         # shape = (k_h, k_w, in_dim, out_dim)
+         kernel_size = np.prod(shape[:2])  # k_h*k_w
+         fan_in = shape[-2]*kernel_size    # fan_out would be shape[-1]*kernel_size
+
+         # compute the scale value
+         s = np.sqrt(1. / fan_in)
+
+         # create variable on a specific GPU device
+         with tf.device('/device:GPU:'+GPU_ID):
+             w = tf.get_variable(name, shape, dtype=self.dtype,
+                                 initializer=tf.random_uniform_initializer(minval=-s, maxval=s),
+                                 regularizer=regularizer, trainable=trainable)
+
+         return w
+
+     def _constant(self, shape, value=0, regularizer=None, trainable=None, name=None):
+         name = 'b' if name is None else name+'/b'
+
+         with tf.device('/device:GPU:'+GPU_ID):
+             b = tf.get_variable(name, shape, dtype=self.dtype,
+                                 initializer=tf.constant_initializer(value=value),
+                                 regularizer=regularizer, trainable=trainable)
+
+         return b
+
+     def _conv2d(self, tensor, dim, size=3, stride=1, rate=1, pad='SAME', act='relu', norm='none', G=16, bias=True, name='conv'):
+         """pre-activation => norm => conv"""
+         in_dim = tensor.shape.as_list()[-1]
+         size = size if isinstance(size, (tuple, list)) else [size, size]
+         stride = stride if isinstance(stride, (tuple, list)) else [1, stride, stride, 1]
+         rate = rate if isinstance(rate, (tuple, list)) else [1, rate, rate, 1]
+         kernel_shape = [size[0], size[1], in_dim, dim]
+
+         w = self._he_uniform(kernel_shape, name=name)
+         b = self._constant(dim, name=name) if bias else 0
+
+         if act == 'relu':
+             tensor = tf.nn.relu(tensor, name=name+'/relu')
+         elif act == 'sigmoid':
+             tensor = tf.nn.sigmoid(tensor, name=name+'/sigmoid')
+         elif act == 'softplus':
+             tensor = tf.nn.softplus(tensor, name=name+'/softplus')
+         elif act == 'leaky_relu':
+             tensor = tf.nn.leaky_relu(tensor, name=name+'/leaky_relu')
+         else:
+             norm = 'none'
+
+         if norm == 'gn':  # group normalization after activation
+             # transpose: [bs, h, w, c] to [bs, c, h, w] following the paper
+             x = tf.transpose(tensor, [0, 3, 1, 2])
+             N, C, H, W = x.get_shape().as_list()
+             G = min(G, C)
+             x = tf.reshape(x, [-1, G, C // G, H, W])
+             mean, var = tf.nn.moments(x, [2, 3, 4], keep_dims=True)
+             x = (x - mean) / tf.sqrt(var + 1e-6)
+
+             # per-channel gamma and beta
+             with tf.device('/device:GPU:'+GPU_ID):
+                 gamma = tf.get_variable(name+'/gamma', [C], dtype=self.dtype, initializer=tf.constant_initializer(1.0))
+                 beta = tf.get_variable(name+'/beta', [C], dtype=self.dtype, initializer=tf.constant_initializer(0.0))
+             gamma = tf.reshape(gamma, [1, C, 1, 1])
+             beta = tf.reshape(beta, [1, C, 1, 1])
+
+             tensor = tf.reshape(x, [-1, C, H, W]) * gamma + beta
+             # transpose back: [bs, c, h, w] to [bs, h, w, c]
+             tensor = tf.transpose(tensor, [0, 2, 3, 1])
+
+         out = tf.nn.conv2d(tensor, w, strides=stride, padding=pad, dilations=rate, name=name) + b  # b == 0 when bias=False
+
+         return out
+
+     def _upconv2d(self, tensor, dim, size=4, stride=2, pad='SAME', act='relu', name='upconv'):
+         [batch_size, h, w, in_dim] = tensor.shape.as_list()
+
+         size = size if isinstance(size, (tuple, list)) else [size, size]
+         stride = stride if isinstance(stride, (tuple, list)) else [1, stride, stride, 1]
+
+         kernel_shape = [size[0], size[1], dim, in_dim]
+         W = self._he_uniform(kernel_shape, name=name)
+
+         if pad == 'SAME':
+             out_shape = [batch_size, h*stride[1], w*stride[2], dim]
+         else:
+             out_shape = [batch_size, (h-1)*stride[1]+size[0],
+                          (w-1)*stride[2]+size[1], dim]
+
+         out = tf.nn.conv2d_transpose(tensor, W, output_shape=tf.stack(out_shape),
+                                      strides=stride, padding=pad, name=name)
+
+         # reset static shape information
+         out.set_shape(out_shape)
+
+         if act == 'relu':
+             out = tf.nn.relu(out, name=name+'/relu')
+         elif act == 'sigmoid':
+             out = tf.nn.sigmoid(out, name=name+'/sigmoid')
+
+         return out
+
+     def _max_pool2d(self, tensor, size=2, stride=2, pad='VALID'):
+         size = size if isinstance(size, (tuple, list)) else [1, size, size, 1]
+         stride = stride if isinstance(stride, (tuple, list)) else [1, stride, stride, 1]
+
+         size = [1, size[0], size[1], 1] if len(size)==2 else size
+         stride = [1, stride[0], stride[1], 1] if len(stride)==2 else stride
+
+         out = tf.nn.max_pool(tensor, size, stride, pad)
+
+         return out
+
+     # the following three functions are used for combining context features
+     def _constant_kernel(self, shape, value=1.0, diag=False, flip=False, regularizer=None, trainable=None, name=None):
+         name = 'fixed_w' if name is None else name+'/fixed_w'
+
+         with tf.device('/device:GPU:'+GPU_ID):
+             if not diag:
+                 k = tf.get_variable(name, shape, dtype=self.dtype,
+                                     initializer=tf.constant_initializer(value=value),
+                                     regularizer=regularizer, trainable=trainable)
+             else:
+                 w = tf.eye(shape[0], num_columns=shape[1])
+                 if flip:
+                     w = tf.reshape(w, (shape[0], shape[1], 1))
+                     w = tf.image.flip_left_right(w)
+                 w = tf.reshape(w, shape)
+                 k = tf.get_variable(name, None, dtype=self.dtype,  # a tensor initializer carries its own shape
+                                     initializer=w,
+                                     regularizer=regularizer, trainable=trainable)
+
+         return k
+
+     def _context_conv2d(self, tensor, dim=1, size=7, diag=False, flip=False, stride=1, name='cconv'):
+         """Combine neighbouring pixels with a fixed (non-trainable) kernel, no bias.
+         Currently only accepts input tensors of depth 1.
+
+         Args:
+             diag: use a diagonal identity matrix as the kernel
+             flip: flip the diagonal kernel left-right (anti-diagonal)
+         """
+         in_dim = tensor.shape.as_list()[-1]  # supposed to be 1
+         size = size if isinstance(size, (tuple, list)) else [size, size]
+         stride = stride if isinstance(stride, (tuple, list)) else [1, stride, stride, 1]
+         kernel_shape = [size[0], size[1], in_dim, dim]
+
+         w = self._constant_kernel(kernel_shape, diag=diag, flip=flip, trainable=False, name=name)
+         out = tf.nn.conv2d(tensor, w, strides=stride, padding='SAME', name=name)
+
+         return out
+
+     def _non_local_context(self, tensor1, tensor2, stride=4, name='non_local_context'):
+         """Use fixed rank-one kernels spanning 1/stride of the image size to combine
+         context features; embedded between the encoder and decoder parts.
+
+         Args:
+             stride: defines the neighbourhood size
+         """
+         assert tensor1.shape.as_list() == tensor2.shape.as_list(), "input tensors should have the same shape"
+
+         [N, H, W, C] = tensor1.shape.as_list()
+
+         hs = H // stride if (H // stride) > 1 else (stride-1)
+         vs = W // stride if (W // stride) > 1 else (stride-1)
+
+         # kernel spans must be odd
+         hs = hs if (hs%2!=0) else hs+1
+         vs = vs if (vs%2!=0) else vs+1
+
+         # compute attention map
+         a = self._conv2d(tensor1, dim=C, name=name+'/fa1')
+         a = self._conv2d(a, dim=C, name=name+'/fa2')
+         a = self._conv2d(a, dim=1, size=1, act='linear', norm=None, name=name+'/a')
+         a = tf.nn.sigmoid(a, name=name+'/a_sigmoid')
+
+         # reduce the tensor depth
+         x = self._conv2d(tensor2, dim=C, name=name+'/fx1')
+         x = self._conv2d(x, dim=1, size=1, act='linear', norm=None, name=name+'/x')
+
+         # apply attention before aggregating context
+         x = a*x
+
+         h = self._context_conv2d(x, size=[hs, 1], name=name+'/cc_h')   # horizontal
+         v = self._context_conv2d(x, size=[1, vs], name=name+'/cc_v')   # vertical
+         d1 = self._context_conv2d(x, size=[hs, vs], diag=True, name=name+'/cc_d1')             # diagonal
+         d2 = self._context_conv2d(x, size=[hs, vs], diag=True, flip=True, name=name+'/cc_d2')  # anti-diagonal
+
+         # double attention, prevents blurring
+         c1 = a*(h+v+d1+d2)
+
+         # expand back to C channels
+         c1 = self._conv2d(c1, dim=C, size=1, act='linear', norm=None, name=name+'/expand')
+
+         # further convolution to learn richer features
+         features = tf.concat([tensor2, c1], axis=3, name=name+'/in_context_concat')
+         out = self._conv2d(features, dim=C, name=name+'/conv2')
+
+         return out, None
+
+     def _up_bilinear(self, tensor, dim, shape, name='upsample'):
+         out = self._conv2d(tensor, dim=dim, size=1, act='linear', name=name+'/1x1_conv')
+         return tf.image.resize_images(out, shape)
+
+     def forward(self, inputs, init_with_pretrain_vgg=False, pre_trained_model='./vgg16/vgg_16.ckpt'):
+         # feature extraction part, shared by both decoders
+         reuse_fnet = len([v for v in tf.global_variables() if v.name.startswith('FNet')]) > 0
+         with tf.variable_scope('FNet', reuse=reuse_fnet):
+             # feature extraction
+             self.conv1_1 = self._conv2d(inputs, dim=64, name='conv1_1')
+             self.conv1_2 = self._conv2d(self.conv1_1, dim=64, name='conv1_2')
+             self.pool1 = self._max_pool2d(self.conv1_2)  # 256 => /2
+
+             self.conv2_1 = self._conv2d(self.pool1, dim=128, name='conv2_1')
+             self.conv2_2 = self._conv2d(self.conv2_1, dim=128, name='conv2_2')
+             self.pool2 = self._max_pool2d(self.conv2_2)  # 128 => /4
+
+             self.conv3_1 = self._conv2d(self.pool2, dim=256, name='conv3_1')
+             self.conv3_2 = self._conv2d(self.conv3_1, dim=256, name='conv3_2')
+             self.conv3_3 = self._conv2d(self.conv3_2, dim=256, name='conv3_3')
+             self.pool3 = self._max_pool2d(self.conv3_3)  # 64 => /8
+
+             self.conv4_1 = self._conv2d(self.pool3, dim=512, name='conv4_1')
+             self.conv4_2 = self._conv2d(self.conv4_1, dim=512, name='conv4_2')
+             self.conv4_3 = self._conv2d(self.conv4_2, dim=512, name='conv4_3')
+             self.pool4 = self._max_pool2d(self.conv4_3)  # 32 => /16
+
+             self.conv5_1 = self._conv2d(self.pool4, dim=512, name='conv5_1')
+             self.conv5_2 = self._conv2d(self.conv5_1, dim=512, name='conv5_2')
+             self.conv5_3 = self._conv2d(self.conv5_2, dim=512, name='conv5_3')
+             self.pool5 = self._max_pool2d(self.conv5_3)  # 16 => /32
+
+             # init feature extraction part from pre-trained vgg16
+             if init_with_pretrain_vgg:
+                 tf.train.init_from_checkpoint(pre_trained_model, self.pre_train_restore_map)
+
+         # input size for logits prediction
+         [n, h, w, c] = inputs.shape.as_list()
+
+         # decoder network for close-wall (boundary) detection
+         reuse_cw_net = len([v for v in tf.global_variables() if v.name.startswith('CWNet')]) > 0
+         with tf.variable_scope('CWNet', reuse=reuse_cw_net):
+             # upsample
+             up2 = (self._upconv2d(self.pool5, dim=256, act='linear', name='up2_1')  # 32 => /16
+                    + self._conv2d(self.pool4, dim=256, act='linear', name='pool4_s'))
+             self.up2_cw = self._conv2d(up2, dim=256, name='up2_3')
+
+             up4 = (self._upconv2d(self.up2_cw, dim=128, act='linear', name='up4_1')  # 64 => /8
+                    + self._conv2d(self.pool3, dim=128, act='linear', name='pool3_s'))
+             self.up4_cw = self._conv2d(up4, dim=128, name='up4_3')
+
+             up8 = (self._upconv2d(self.up4_cw, dim=64, act='linear', name='up8_1')  # 128 => /4
+                    + self._conv2d(self.pool2, dim=64, act='linear', name='pool2_s'))
+             self.up8_cw = self._conv2d(up8, dim=64, name='up8_2')
+
+             up16 = (self._upconv2d(self.up8_cw, dim=32, act='linear', name='up16_1')  # 256 => /2
+                     + self._conv2d(self.pool1, dim=32, act='linear', name='pool1_s'))
+             self.up16_cw = self._conv2d(up16, dim=32, name='up16_2')
+
+             # predict logits
+             logits_cw = self._up_bilinear(self.up16_cw, dim=3, shape=(h, w), name='logits')
+
+         # decoder network for room-type detection
+         reuse_rnet = len([v for v in tf.global_variables() if v.name.startswith('RNet')]) > 0
+         with tf.variable_scope('RNet', reuse=reuse_rnet):
+             # upsample
+             up2 = (self._upconv2d(self.pool5, dim=256, act='linear', name='up2_1')  # 32 => /16
+                    + self._conv2d(self.pool4, dim=256, act='linear', name='pool4_s'))
+             up2 = self._conv2d(up2, dim=256, name='up2_2')
+             up2, _ = self._non_local_context(self.up2_cw, up2, name='context_up2')
+
+             up4 = (self._upconv2d(up2, dim=128, act='linear', name='up4_1')  # 64 => /8
+                    + self._conv2d(self.pool3, dim=128, act='linear', name='pool3_s'))
+             up4 = self._conv2d(up4, dim=128, name='up4_2')
+             up4, _ = self._non_local_context(self.up4_cw, up4, name='context_up4')
+
+             up8 = (self._upconv2d(up4, dim=64, act='linear', name='up8_1')  # 128 => /4
+                    + self._conv2d(self.pool2, dim=64, act='linear', name='pool2_s'))
+             up8 = self._conv2d(up8, dim=64, name='up8_2')
+             up8, _ = self._non_local_context(self.up8_cw, up8, name='context_up8')
+
+             up16 = (self._upconv2d(up8, dim=32, act='linear', name='up16_1')  # 256 => /2
+                     + self._conv2d(self.pool1, dim=32, act='linear', name='pool1_s'))
+             up16 = self._conv2d(up16, dim=32, name='up16_2')
+             self.up16_r, self.a = self._non_local_context(self.up16_cw, up16, name='context_up16')
+
+             # predict logits
+             logits_r = self._up_bilinear(self.up16_r, dim=9, shape=(h, w), name='logits')
+
+         return logits_r, logits_cw
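The `gn` branch of `_conv2d` above implements group normalization over `[G, C//G, H, W]` groups. The same computation in plain NumPy (an illustrative sketch, omitting the learned `gamma`/`beta` scale and shift):

```python
import numpy as np

def group_norm(x, G=16, eps=1e-6):
    # x: [N, H, W, C]; normalize over (C//G channels, H, W) per group,
    # mirroring the transpose/reshape/moments sequence in _conv2d's gn branch
    N, H, W, C = x.shape
    G = min(G, C)
    xt = x.transpose(0, 3, 1, 2).reshape(N, G, C // G, H, W)
    mean = xt.mean(axis=(2, 3, 4), keepdims=True)
    var = xt.var(axis=(2, 3, 4), keepdims=True)
    xt = (xt - mean) / np.sqrt(var + eps)
    return xt.reshape(N, C, H, W).transpose(0, 2, 3, 1)

out = group_norm(np.random.RandomState(0).randn(2, 8, 8, 32), G=16)
```

Each group comes out zero-mean and unit-variance, independent of batch size, which is why the statistics can be computed per-sample rather than across the batch.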
postprocess.py ADDED
@@ -0,0 +1,83 @@
+ import argparse
+
+ import os
+ import sys
+ import glob
+ import time
+ import random
+
+ import numpy as np
+ import imageio
+ from PIL import Image
+ from matplotlib import pyplot as plt
+
+ sys.path.append('./utils/')
+ from rgb_ind_convertor import *
+ from util import *
+
+ parser = argparse.ArgumentParser()
+
+ parser.add_argument('--result_dir', type=str, default='./out',
+                     help='The folder that saves network predictions.')
+
+ def post_process(input_dir, save_dir, merge=True):
+     if not os.path.exists(save_dir):
+         os.mkdir(save_dir)
+
+     input_paths = sorted(glob.glob(os.path.join(input_dir, '*.png')))
+     names = [i.split('/')[-1] for i in input_paths]
+     out_paths = [os.path.join(save_dir, i) for i in names]
+
+     n = len(input_paths)
+     for i in range(n):
+         im = imageio.imread(input_paths[i], pilmode='RGB')  # imageio uses `pilmode`, not scipy's `mode`
+         im_ind = rgb2ind(im, color_map=floorplan_fuse_map)
+         # separate image into room-seg & boundary-seg
+         rm_ind = im_ind.copy()
+         rm_ind[im_ind==9] = 0
+         rm_ind[im_ind==10] = 0
+
+         bd_ind = np.zeros(im_ind.shape, dtype=np.uint8)
+         bd_ind[im_ind==9] = 9
+         bd_ind[im_ind==10] = 10
+
+         hard_c = (bd_ind>0).astype(np.uint8)
+
+         # region from the room prediction itself
+         rm_mask = np.zeros(rm_ind.shape)
+         rm_mask[rm_ind>0] = 1
+         # region from the close-wall line
+         cw_mask = hard_c
+         # refine the close-wall mask by filling the gaps between broken line segments
+         cw_mask = fill_break_line(cw_mask)
+
+         fuse_mask = cw_mask + rm_mask
+         fuse_mask[fuse_mask>=1] = 255
+
+         # refine the fused mask by filling the holes
+         fuse_mask = flood_fill(fuse_mask)
+         fuse_mask = fuse_mask // 255
+
+         # one room, one label
+         new_rm_ind = refine_room_region(cw_mask, rm_ind)
+
+         # ignore background mislabeling
+         new_rm_ind = fuse_mask*new_rm_ind
+
+         print('Saving {}th refined room prediction to {}'.format(i, out_paths[i]))
+         if merge:
+             new_rm_ind[bd_ind==9] = 9
+             new_rm_ind[bd_ind==10] = 10
+             rgb = ind2rgb(new_rm_ind, color_map=floorplan_fuse_map)
+         else:
+             rgb = ind2rgb(new_rm_ind)
+         imageio.imwrite(out_paths[i], rgb)
+
+ if __name__ == '__main__':
+     FLAGS, unparsed = parser.parse_known_args()
+
+     input_dir = FLAGS.result_dir
+     save_dir = os.path.join(input_dir, 'post')
+
+     post_process(input_dir, save_dir)
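`fill_break_line` above (imported from `utils`, not shown in this diff) bridges small gaps in the predicted wall lines. Morphological closing from `scipy.ndimage` (already a project dependency) is one plausible way such a helper could work; the function name and structuring-element size here are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def close_line_gaps(mask, size=3):
    # morphological closing (dilate then erode) with a size x size structuring
    # element: bridges gaps narrower than the element in thin wall lines
    structure = np.ones((size, size), dtype=bool)
    return ndimage.binary_closing(mask.astype(bool), structure=structure).astype(np.uint8)

# a horizontal line with a one-pixel break gets reconnected at the gap
mask = np.zeros((3, 5), dtype=np.uint8)
mask[1] = [1, 1, 0, 1, 1]
closed = close_line_gaps(mask)
```

Note that closing with zero border padding also erodes the line endpoints, so a real implementation may pad the mask or treat borders specially.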
pretrained/download_links.txt ADDED
@@ -0,0 +1 @@
+ download link: https://mycuhk-my.sharepoint.com/:f:/g/personal/1155052510_link_cuhk_edu_hk/EgyJhisy04hNnxKncWl5zksBf9zDKDpMJ7c0V-q53_pxuA?e=P0BjZd
requirements.txt ADDED
@@ -0,0 +1,7 @@
+ tensorflow==1.15.5
+ numpy>=1.18.0
+ Pillow>=8.0.0
+ imageio>=2.9.0
+ gradio>=3.0.0
+ matplotlib>=3.0.0
+ scipy>=1.4.0
scores.py ADDED
@@ -0,0 +1,123 @@
+ import argparse
+ import numpy as np
+ import imageio
+ from PIL import Image
+
+ import os
+ import sys
+ import glob
+ import time
+
+ sys.path.append('./utils/')
+ from metrics import fast_hist
+ from rgb_ind_convertor import *
+
+ parser = argparse.ArgumentParser()
+
+ parser.add_argument('--dataset', type=str, default='R3D',
+                     help='define the benchmark')
+
+ parser.add_argument('--result_dir', type=str, default='./out',
+                     help='define the storage folder of network prediction')
+
+ def evaluate_semantic(benchmark_path, result_dir, num_of_classes=11, need_merge_result=False, im_downsample=False, gt_downsample=False):
+     gt_paths = open(benchmark_path, 'r').read().splitlines()
+     d_paths = [p.split('\t')[2] for p in gt_paths]    # column 2: doors
+     r_paths = [p.split('\t')[3] for p in gt_paths]    # column 3: rooms
+     cw_paths = [p.split('\t')[-1] for p in gt_paths]  # last column: close wall
+     im_paths = [os.path.join(result_dir, p.split('/')[-1]) for p in r_paths]
+     if need_merge_result:
+         im_paths = [os.path.join(result_dir+'/room', p.split('/')[-1]) for p in r_paths]
+         im_d_paths = [os.path.join(result_dir+'/door', p.split('/')[-1]) for p in d_paths]
+         im_cw_paths = [os.path.join(result_dir+'/close_wall', p.split('/')[-1]) for p in cw_paths]
+
+     n = len(im_paths)
+     hist = np.zeros((num_of_classes, num_of_classes))
+     for i in range(n):
+         im = imageio.imread(im_paths[i], pilmode='RGB')  # imageio uses `pilmode`, not scipy's `mode`
+         if need_merge_result:
+             im_d = imageio.imread(im_d_paths[i], pilmode='L')
+             im_cw = imageio.imread(im_cw_paths[i], pilmode='L')
+         # create fused semantic label
+         cw = imageio.imread(cw_paths[i], pilmode='L')
+         dd = imageio.imread(d_paths[i], pilmode='L')
+         rr = imageio.imread(r_paths[i], pilmode='RGB')
+
+         if im_downsample:
+             # PIL's resize takes a (width, height) tuple and returns a PIL Image
+             im = np.array(Image.fromarray(im).resize((512, 512), Image.Resampling.LANCZOS))
+             if need_merge_result:
+                 im_d = np.array(Image.fromarray(im_d).resize((512, 512), Image.Resampling.LANCZOS)) / 255
+                 im_cw = np.array(Image.fromarray(im_cw).resize((512, 512), Image.Resampling.LANCZOS)) / 255
+
+         if gt_downsample:
+             cw = np.array(Image.fromarray(cw).resize((512, 512), Image.Resampling.LANCZOS))
+             dd = np.array(Image.fromarray(dd).resize((512, 512), Image.Resampling.LANCZOS))
+             rr = np.array(Image.fromarray(rr).resize((512, 512), Image.Resampling.LANCZOS))
+
+         # normalize
+         cw = cw / 255
+         dd = dd / 255
+
+         im_ind = rgb2ind(im, color_map=floorplan_fuse_map)
+         if im_ind.sum()==0:
+             im_ind = rgb2ind(im+1)
+         rr_ind = rgb2ind(rr, color_map=floorplan_fuse_map)
+         if rr_ind.sum()==0:
+             rr_ind = rgb2ind(rr+1)
+
+         if need_merge_result:
+             im_d = (im_d>0.5).astype(np.uint8)
+             im_cw = (im_cw>0.5).astype(np.uint8)
+             im_ind[im_cw==1] = 10
+             im_ind[im_d==1] = 9
+
+         # merge the labels
+         cw = (cw>0.5).astype(np.uint8)
+         dd = (dd>0.5).astype(np.uint8)
+         rr_ind[cw==1] = 10
+         rr_ind[dd==1] = 9
+
+         name = im_paths[i].split('/')[-1]
+         r_name = r_paths[i].split('/')[-1]
+
+         print('Evaluating {}(im) <=> {}(gt)...'.format(name, r_name))
+
+         hist += fast_hist(im_ind.flatten(), rr_ind.flatten(), num_of_classes)
+
+     print('*'*60)
+     # overall accuracy
+     acc = np.diag(hist).sum() / hist.sum()
+     print('overall accuracy {:.4}'.format(acc))
+     # per-class accuracy, avoid division by zero
+     acc = np.diag(hist) / (hist.sum(1) + 1e-6)
+     print('room-type: mean accuracy {:.4}, room-type+bd: mean accuracy {:.4}'.format(np.nanmean(acc[:7]), (np.nansum(acc[:7])+np.nansum(acc[-2:]))/9.))
+     for t in range(0, acc.shape[0]):
+         if t not in [7, 8]:  # ignore class 7 & 8
+             print('room type {}th, accuracy = {:.4}'.format(t, acc[t]))
+
+     print('*'*60)
+     # per-class IoU, avoid division by zero
+     iu = np.diag(hist) / (hist.sum(1) + 1e-6 + hist.sum(0) - np.diag(hist))
+     print('room-type: mean IoU {:.4}, room-type+bd: mean IoU {:.4}'.format(np.nanmean(iu[:7]), (np.nansum(iu[:7])+np.nansum(iu[-2:]))/9.))
+     for t in range(iu.shape[0]):
+         if t not in [7, 8]:  # ignore class 7 & 8
+             print('room type {}th, IoU = {:.4}'.format(t, iu[t]))
+
+ if __name__ == '__main__':
+     FLAGS, unparsed = parser.parse_known_args()
+
+     if FLAGS.dataset.lower() == 'r2v':
+         benchmark_path = './dataset/r2v_test.txt'
+     else:
+         benchmark_path = './dataset/r3d_test.txt'
+
+     result_dir = FLAGS.result_dir
+
+     tic = time.time()
+     evaluate_semantic(benchmark_path, result_dir, need_merge_result=False, im_downsample=False, gt_downsample=True)  # 11 classes: room types plus the opening and wall-line labels
+
+     print("*"*60)
+     print("Evaluate time: {} sec".format(time.time()-tic))
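`evaluate_semantic` above derives per-class IoU from the accumulated confusion matrix with `np.diag(hist) / (hist.sum(1) + hist.sum(0) - np.diag(hist))`. As a self-contained check of that formula on a tiny hand-built matrix:

```python
import numpy as np

def per_class_iou(hist):
    # IoU_c = TP_c / (TP_c + FP_c + FN_c):
    # diagonal over (row sum + column sum - diagonal), with a small epsilon
    return np.diag(hist) / (hist.sum(1) + hist.sum(0) - np.diag(hist) + 1e-6)

# rows = one class axis, columns = the other; off-diagonals are confusions
iou = per_class_iou(np.array([[3., 1.], [0., 2.]]))
```

Class 0 has 3 true positives out of a union of 4 pixels (3/4), class 1 has 2 out of 3 (2/3), matching the manual TP/FP/FN count.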
utils/create_tfrecord.py ADDED
@@ -0,0 +1,65 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
+ """
+ Prepare the raw images in one folder, and make sure the paths
+ match the entries in train_file/test_file.
+ """
+
+ from tf_record import *
+ import imageio
+ from PIL import Image
+
+ train_file = '../dataset/r2v_train.txt'
+ test_file = '../dataset/r2v_test.txt'
+
+ # debug
+ if __name__ == '__main__':
+     # write to TFRecord
+     train_paths = open(train_file, 'r').read().splitlines()
+     # test_paths = open(test_file, 'r').read().splitlines()
+
+     # write_record(train_paths, name='../dataset/jp_train.tfrecords')
+     # write_record(test_paths, name='../dataset/newyork_test.tfrecords')
+
+     # write_seg_record(train_paths, name='../dataset/jp_seg_train.tfrecords')
+     # write_seg_record(train_paths, name='../dataset/newyork_seg_train.tfrecords')
+
+     write_bd_rm_record(train_paths, name='../dataset/jp_train.tfrecords')
+     # write_bd_rm_record(train_paths, name='../dataset/all_train3.tfrecords')
+
+     # read from TFRecord
+     # loader_list = read_record('../dataset/jp_train.tfrecords')
+     # loader_list = read_seg_record('../dataset/jp_seg_train.tfrecords')
+
+     # loader_list = read_bd_rm_record('../dataset/newyork_bd_rm_train.tfrecords')
+     # loader_list = read_bd_rm_record('../dataset/jp_bd_rm_train.tfrecords')
+
+     # images = loader_list['images']
+     # bd_ind = loader_list['label_boundaries']
+     # rm_ind = loader_list['label_rooms']
+
+     # with tf.Session() as sess:
+     #     # init all variables in graph
+     #     sess.run(tf.group(tf.global_variables_initializer(),
+     #                       tf.local_variables_initializer()))
+
+     #     coord = tf.train.Coordinator()
+     #     threads = tf.train.start_queue_runners(sess=sess, coord=coord)
+
+     #     image, bd, rm = sess.run([images, bd_ind, rm_ind])
+
+     #     print('sess run image shape = ', image.shape)
+     #     print('sess run wall shape = ', bd.shape)
+     #     print('sess run room shape = ', rm.shape)
+
+     #     bd = np.argmax(np.squeeze(bd), axis=-1)
+     #     rm = np.argmax(np.squeeze(rm), axis=-1)
+     #     plt.subplot(231)
+     #     plt.imshow(np.squeeze(image))
+     #     plt.subplot(233)
+     #     plt.imshow(bd)
+     #     plt.subplot(234)
+     #     plt.imshow(rm)
+     #     plt.show()
+
+     #     coord.request_stop()
+     #     coord.join(threads)
+     #     sess.close()
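The list files referenced here are consumed by `load_raw_images` in `tf_record.py`, which splits each line on tabs into five paths (original image, wall mask, opening mask, room-label map, close+wall mask). A sketch of the assumed line format, with hypothetical file names:

```python
# Hypothetical example of one line in r2v_train.txt / r2v_test.txt; the actual
# file names depend on your dataset layout. load_raw_images splits on tabs:
# image, wall, opening ("close"), room-label map, close_wall.
line = ("floorplan/0001.png\tfloorplan/0001_wall.png\t"
        "floorplan/0001_close.png\tfloorplan/0001_rooms.png\t"
        "floorplan/0001_close_wall.png")
paths = line.split('\t')
image_path, wall_path, close_path, room_path, close_wall_path = paths
```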
utils/rgb_ind_convertor.py ADDED
@@ -0,0 +1,79 @@
+ import numpy as np
+ from PIL import Image
+
+ # map a class index to an RGB color
+ floorplan_room_map = {
+     0: [  0,   0,   0], # background
+     1: [192, 192, 224], # closet
+     2: [192, 255, 255], # bathroom/washroom
+     3: [224, 255, 192], # livingroom/kitchen/diningroom
+     4: [255, 224, 128], # bedroom
+     5: [255, 160,  96], # hall
+     6: [255, 224, 224], # balcony
+     7: [224, 224, 224], # not used
+     8: [224, 224, 128]  # not used
+ }
+
+ # boundary label
+ floorplan_boundary_map = {
+     0: [  0,   0,   0], # background
+     1: [255,  60, 128], # opening (door & window)
+     2: [255, 255, 255]  # wall line
+ }
+
+ # boundary label for presentation
+ floorplan_boundary_map_figure = {
+     0: [255, 255, 255], # background
+     1: [255,  60, 128], # opening (door & window)
+     2: [  0,   0,   0]  # wall line
+ }
+
+ # merge all labels into one multi-class map
+ floorplan_fuse_map = {
+     0: [  0,   0,   0], # background
+     1: [192, 192, 224], # closet
+     2: [192, 255, 255], # bathroom/washroom
+     3: [224, 255, 192], # livingroom/kitchen/dining room
+     4: [255, 224, 128], # bedroom
+     5: [255, 160,  96], # hall
+     6: [255, 224, 224], # balcony
+     7: [224, 224, 224], # not used
+     8: [224, 224, 128], # not used
+     9: [255,  60, 128], # extra label for opening (door & window)
+     10: [255, 255, 255] # extra label for wall line
+ }
+
+ # invert the colors of wall line and background for presentation
+ floorplan_fuse_map_figure = {
+     0: [255, 255, 255], # background
+     1: [192, 192, 224], # closet
+     2: [192, 255, 255], # bathroom/washroom
+     3: [224, 255, 192], # livingroom/kitchen/dining room
+     4: [255, 224, 128], # bedroom
+     5: [255, 160,  96], # hall
+     6: [255, 224, 224], # balcony
+     7: [224, 224, 224], # not used
+     8: [224, 224, 128], # not used
+     9: [255,  60, 128], # extra label for opening (door & window)
+     10: [  0,   0,   0] # extra label for wall line
+ }
+
+ def rgb2ind(im, color_map=floorplan_room_map):
+     ind = np.zeros((im.shape[0], im.shape[1]))
+
+     for i, rgb in color_map.items():
+         ind[(im==rgb).all(2)] = i
+
+     # return ind.astype(int) # int => int64
+     return ind.astype(np.uint8) # force to uint8
+
+ def ind2rgb(ind_im, color_map=floorplan_room_map):
+     rgb_im = np.zeros((ind_im.shape[0], ind_im.shape[1], 3))
+
+     for i, rgb in color_map.items():
+         rgb_im[(ind_im==i)] = rgb
+
+     return rgb_im
+
+ def unscale_imsave(path, im, cmin=0, cmax=255):
+     # save as 8-bit grayscale without rescaling (cmin/cmax kept for API compatibility)
+     Image.fromarray(im, 'L').save(path)
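`rgb2ind` and `ind2rgb` are inverses on any color that appears in the map. A self-contained round-trip sketch, re-declaring a reduced two-entry color map so the snippet runs on its own:

```python
import numpy as np

# Reduced two-entry color map (re-declared here so the snippet is standalone).
color_map = {0: [0, 0, 0],        # background
             4: [255, 224, 128]}  # bedroom

def rgb2ind(im, color_map):
    # map each pixel's RGB triple to its class index
    ind = np.zeros(im.shape[:2], dtype=np.uint8)
    for i, rgb in color_map.items():
        ind[(im == rgb).all(axis=2)] = i
    return ind

def ind2rgb(ind_im, color_map):
    # paint each class index with its RGB color
    rgb_im = np.zeros(ind_im.shape + (3,), dtype=np.uint8)
    for i, rgb in color_map.items():
        rgb_im[ind_im == i] = rgb
    return rgb_im

ind = np.array([[0, 4],
                [4, 0]], dtype=np.uint8)
round_trip = rgb2ind(ind2rgb(ind, color_map), color_map)
```

Pixels whose color is not in the map fall through to index 0, which is why index images should only be built from label maps rendered with these exact colors.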
utils/tf_record.py ADDED
@@ -0,0 +1,357 @@
+ import numpy as np
+
+ import tensorflow as tf
+
+ import imageio
+ from PIL import Image
+ from matplotlib import pyplot as plt
+ from rgb_ind_convertor import *
+
+ import os
+ import sys
+ import glob
+ import time
+
+ def load_raw_images(path):
+     paths = path.split('\t')
+
+     image = imageio.imread(paths[0], mode='RGB')
+     wall = imageio.imread(paths[1], mode='L')
+     close = imageio.imread(paths[2], mode='L')
+     room = imageio.imread(paths[3], mode='RGB')
+     close_wall = imageio.imread(paths[4], mode='L')
+
+     # NOTE: resize via PIL; results are cast back to uint8 below
+     image = Image.fromarray(image).resize((512, 512), Image.BICUBIC)
+     wall = Image.fromarray(wall).resize((512, 512), Image.BICUBIC)
+     close = Image.fromarray(close).resize((512, 512), Image.BICUBIC)
+     close_wall = Image.fromarray(close_wall).resize((512, 512), Image.BICUBIC)
+     room = Image.fromarray(room).resize((512, 512), Image.BICUBIC)
+
+     room_ind = rgb2ind(np.array(room))
+
+     # make sure the dtype is uint8
+     image = np.array(image).astype(np.uint8)
+     wall = np.array(wall).astype(np.uint8)
+     close = np.array(close).astype(np.uint8)
+     close_wall = np.array(close_wall).astype(np.uint8)
+     room_ind = room_ind.astype(np.uint8)
+
+     # debug
+     # plt.subplot(231)
+     # plt.imshow(image)
+     # plt.subplot(233)
+     # plt.imshow(wall, cmap='gray')
+     # plt.subplot(234)
+     # plt.imshow(close, cmap='gray')
+     # plt.subplot(235)
+     # plt.imshow(room_ind)
+     # plt.subplot(236)
+     # plt.imshow(close_wall, cmap='gray')
+     # plt.show()
+
+     return image, wall, close, room_ind, close_wall
+
+ def _int64_feature(value):
+     return tf.train.Feature(int64_list=tf.train.Int64List(value=[value]))
+
+ def _bytes_feature(value):
+     return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))
+
+ def write_record(paths, name='dataset.tfrecords'):
+     writer = tf.python_io.TFRecordWriter(name)
+
+     for i in range(len(paths)):
+         # Load the images
+         image, wall, close, room_ind, close_wall = load_raw_images(paths[i])
+
+         # Create a feature
+         feature = {'image': _bytes_feature(tf.compat.as_bytes(image.tobytes())),
+                    'wall': _bytes_feature(tf.compat.as_bytes(wall.tobytes())),
+                    'close': _bytes_feature(tf.compat.as_bytes(close.tobytes())),
+                    'room': _bytes_feature(tf.compat.as_bytes(room_ind.tobytes())),
+                    'close_wall': _bytes_feature(tf.compat.as_bytes(close_wall.tobytes()))}
+
+         # Create an example protocol buffer
+         example = tf.train.Example(features=tf.train.Features(feature=feature))
+
+         # Serialize to string and write to the file
+         writer.write(example.SerializeToString())
+
+     writer.close()
+
+ def read_record(data_path, batch_size=1, size=512):
+     feature = {'image': tf.FixedLenFeature(shape=(), dtype=tf.string),
+                'wall': tf.FixedLenFeature(shape=(), dtype=tf.string),
+                'close': tf.FixedLenFeature(shape=(), dtype=tf.string),
+                'room': tf.FixedLenFeature(shape=(), dtype=tf.string),
+                'close_wall': tf.FixedLenFeature(shape=(), dtype=tf.string)}
+
+     # Create a list of filenames and pass it to a queue
+     filename_queue = tf.train.string_input_producer([data_path], num_epochs=None, shuffle=False, capacity=batch_size*128)
+
+     # Define a reader and read the next record
+     reader = tf.TFRecordReader()
+     _, serialized_example = reader.read(filename_queue)
+
+     # Decode the record read by the reader
+     features = tf.parse_single_example(serialized_example, features=feature)
+
+     # Convert the image data from string back to numbers
+     image = tf.decode_raw(features['image'], tf.uint8)
+     wall = tf.decode_raw(features['wall'], tf.uint8)
+     close = tf.decode_raw(features['close'], tf.uint8)
+     room = tf.decode_raw(features['room'], tf.uint8)
+     close_wall = tf.decode_raw(features['close_wall'], tf.uint8)
+
+     # Cast data
+     image = tf.cast(image, dtype=tf.float32)
+     wall = tf.cast(wall, dtype=tf.float32)
+     close = tf.cast(close, dtype=tf.float32)
+     # room = tf.cast(room, dtype=tf.float32)
+     close_wall = tf.cast(close_wall, dtype=tf.float32)
+
+     # Reshape image data into the original shape
+     image = tf.reshape(image, [size, size, 3])
+     wall = tf.reshape(wall, [size, size, 1])
+     close = tf.reshape(close, [size, size, 1])
+     room = tf.reshape(room, [size, size])
+     close_wall = tf.reshape(close_wall, [size, size, 1])
+
+     # Any preprocessing here ...
+     # normalize
+     image = tf.divide(image, tf.constant(255.0))
+     wall = tf.divide(wall, tf.constant(255.0))
+     close = tf.divide(close, tf.constant(255.0))
+     close_wall = tf.divide(close_wall, tf.constant(255.0))
+
+     # Generate one-hot room label
+     room_one_hot = tf.one_hot(room, 9, axis=-1)
+
+     # Create batches by randomly shuffling tensors
+     images, walls, closes, rooms, close_walls = tf.train.shuffle_batch([image, wall, close, room_one_hot, close_wall],
+                         batch_size=batch_size, capacity=batch_size*128, num_threads=1, min_after_dequeue=batch_size*32)
+
+     # images, walls = tf.train.shuffle_batch([image, wall],
+     #                     batch_size=batch_size, capacity=batch_size*128, num_threads=1, min_after_dequeue=batch_size*32)
+
+     return {'images': images, 'walls': walls, 'closes': closes, 'rooms': rooms, 'close_walls': close_walls}
+     # return {'images': images, 'walls': walls}
+
+ # ------------------------------------------------------------------------------------------------------------------------------------- *
+ # The following are only for the segmentation task; merge all labels into one
+
+ def load_seg_raw_images(path):
+     paths = path.split('\t')
+
+     image = imageio.imread(paths[0], mode='RGB')
+     close = imageio.imread(paths[2], mode='L')
+     room = imageio.imread(paths[3], mode='RGB')
+     close_wall = imageio.imread(paths[4], mode='L')
+
+     # NOTE: resize via PIL, then normalize the binary masks to [0, 1]
+     image = Image.fromarray(image).resize((512, 512), Image.BICUBIC)
+     close = np.array(Image.fromarray(close).resize((512, 512), Image.BICUBIC)) / 255.
+     close_wall = np.array(Image.fromarray(close_wall).resize((512, 512), Image.BICUBIC)) / 255.
+     room = Image.fromarray(room).resize((512, 512), Image.BICUBIC)
+
+     room_ind = rgb2ind(np.array(room))
+
+     # merge result
+     d_ind = (close > 0.5).astype(np.uint8)
+     cw_ind = (close_wall > 0.5).astype(np.uint8)
+     room_ind[cw_ind == 1] = 10
+     room_ind[d_ind == 1] = 9
+
+     # make sure the dtype is uint8
+     image = np.array(image).astype(np.uint8)
+     room_ind = room_ind.astype(np.uint8)
+
+     # debug
+     # merge = ind2rgb(room_ind, color_map=floorplan_fuse_map)
+     # plt.subplot(131)
+     # plt.imshow(image)
+     # plt.subplot(132)
+     # plt.imshow(room_ind)
+     # plt.subplot(133)
+     # plt.imshow(merge/256.)
+     # plt.show()
+
+     return image, room_ind
+
+ def write_seg_record(paths, name='dataset.tfrecords'):
+     writer = tf.python_io.TFRecordWriter(name)
+
+     for i in range(len(paths)):
+         # Load the images
+         image, room_ind = load_seg_raw_images(paths[i])
+
+         # Create a feature
+         feature = {'image': _bytes_feature(tf.compat.as_bytes(image.tobytes())),
+                    'label': _bytes_feature(tf.compat.as_bytes(room_ind.tobytes()))}
+
+         # Create an example protocol buffer
+         example = tf.train.Example(features=tf.train.Features(feature=feature))
+
+         # Serialize to string and write to the file
+         writer.write(example.SerializeToString())
+
+     writer.close()
+
+ def read_seg_record(data_path, batch_size=1, size=512):
+     feature = {'image': tf.FixedLenFeature(shape=(), dtype=tf.string),
+                'label': tf.FixedLenFeature(shape=(), dtype=tf.string)}
+
+     # Create a list of filenames and pass it to a queue
+     filename_queue = tf.train.string_input_producer([data_path], num_epochs=None, shuffle=False, capacity=batch_size*128)
+
+     # Define a reader and read the next record
+     reader = tf.TFRecordReader()
+     _, serialized_example = reader.read(filename_queue)
+
+     # Decode the record read by the reader
+     features = tf.parse_single_example(serialized_example, features=feature)
+
+     # Convert the image data from string back to numbers
+     image = tf.decode_raw(features['image'], tf.uint8)
+     label = tf.decode_raw(features['label'], tf.uint8)
+
+     # Cast data
+     image = tf.cast(image, dtype=tf.float32)
+
+     # Reshape image data into the original shape
+     image = tf.reshape(image, [size, size, 3])
+     label = tf.reshape(label, [size, size])
+
+     # Any preprocessing here ...
+     # normalize
+     image = tf.divide(image, tf.constant(255.0))
+
+     # Generate one-hot label (9 room classes + opening + wall line = 11)
+     label_one_hot = tf.one_hot(label, 11, axis=-1)
+
+     # Create batches by randomly shuffling tensors
+     images, labels = tf.train.shuffle_batch([image, label_one_hot],
+                         batch_size=batch_size, capacity=batch_size*128, num_threads=1, min_after_dequeue=batch_size*32)
+
+     return {'images': images, 'labels': labels}
+
+ # ------------------------------------------------------------------------------------------------------------------------------------- *
+ # The following are only for the multi-task network, with two labels (boundary and room)
+
+ def load_bd_rm_images(path):
+     paths = path.split('\t')
+
+     image = imageio.imread(paths[0], mode='RGB')
+     close = imageio.imread(paths[2], mode='L')
+     room = imageio.imread(paths[3], mode='RGB')
+     close_wall = imageio.imread(paths[4], mode='L')
+
+     # NOTE: resize via PIL, then normalize the binary masks to [0, 1]
+     image = Image.fromarray(image).resize((512, 512), Image.BICUBIC)
+     close = np.array(Image.fromarray(close).resize((512, 512), Image.BICUBIC)) / 255.
+     close_wall = np.array(Image.fromarray(close_wall).resize((512, 512), Image.BICUBIC)) / 255.
+     room = Image.fromarray(room).resize((512, 512), Image.BICUBIC)
+
+     room_ind = rgb2ind(np.array(room))
+
+     # merge result
+     d_ind = (close > 0.5).astype(np.uint8)
+     cw_ind = (close_wall > 0.5).astype(np.uint8)
+
+     cw_ind[cw_ind == 1] = 2
+     cw_ind[d_ind == 1] = 1
+
+     # make sure the dtype is uint8
+     image = np.array(image).astype(np.uint8)
+     room_ind = room_ind.astype(np.uint8)
+     cw_ind = cw_ind.astype(np.uint8)
+
+     # debugging
+     # merge = ind2rgb(room_ind, color_map=floorplan_fuse_map)
+     # rm = ind2rgb(room_ind)
+     # bd = ind2rgb(cw_ind, color_map=floorplan_boundary_map)
+     # plt.subplot(131)
+     # plt.imshow(image)
+     # plt.subplot(132)
+     # plt.imshow(rm/256.)
+     # plt.subplot(133)
+     # plt.imshow(bd/256.)
+     # plt.show()
+
+     return image, cw_ind, room_ind, d_ind
+
+ def write_bd_rm_record(paths, name='dataset.tfrecords'):
+     writer = tf.python_io.TFRecordWriter(name)
+
+     for i in range(len(paths)):
+         # Load the images
+         image, cw_ind, room_ind, d_ind = load_bd_rm_images(paths[i])
+
+         # Create a feature
+         feature = {'image': _bytes_feature(tf.compat.as_bytes(image.tobytes())),
+                    'boundary': _bytes_feature(tf.compat.as_bytes(cw_ind.tobytes())),
+                    'room': _bytes_feature(tf.compat.as_bytes(room_ind.tobytes())),
+                    'door': _bytes_feature(tf.compat.as_bytes(d_ind.tobytes()))}
+
+         # Create an example protocol buffer
+         example = tf.train.Example(features=tf.train.Features(feature=feature))
+
+         # Serialize to string and write to the file
+         writer.write(example.SerializeToString())
+
+     writer.close()
+
+ def read_bd_rm_record(data_path, batch_size=1, size=512):
+     feature = {'image': tf.FixedLenFeature(shape=(), dtype=tf.string),
+                'boundary': tf.FixedLenFeature(shape=(), dtype=tf.string),
+                'room': tf.FixedLenFeature(shape=(), dtype=tf.string),
+                'door': tf.FixedLenFeature(shape=(), dtype=tf.string)}
+
+     # Create a list of filenames and pass it to a queue
+     filename_queue = tf.train.string_input_producer([data_path], num_epochs=None, shuffle=False, capacity=batch_size*128)
+
+     # Define a reader and read the next record
+     reader = tf.TFRecordReader()
+     _, serialized_example = reader.read(filename_queue)
+
+     # Decode the record read by the reader
+     features = tf.parse_single_example(serialized_example, features=feature)
+
+     # Convert the image data from string back to numbers
+     image = tf.decode_raw(features['image'], tf.uint8)
+     boundary = tf.decode_raw(features['boundary'], tf.uint8)
+     room = tf.decode_raw(features['room'], tf.uint8)
+     door = tf.decode_raw(features['door'], tf.uint8)
+
+     # Cast data
+     image = tf.cast(image, dtype=tf.float32)
+
+     # Reshape image data into the original shape
+     image = tf.reshape(image, [size, size, 3])
+     boundary = tf.reshape(boundary, [size, size])
+     room = tf.reshape(room, [size, size])
+     door = tf.reshape(door, [size, size])
+
+     # Any preprocessing here ...
+     # normalize
+     image = tf.divide(image, tf.constant(255.0))
+
+     # Generate one-hot labels
+     label_boundary = tf.one_hot(boundary, 3, axis=-1)
+     label_room = tf.one_hot(room, 9, axis=-1)
+
+     # Create batches by randomly shuffling tensors
+     images, label_boundaries, label_rooms, label_doors = tf.train.shuffle_batch([image, label_boundary, label_room, door],
+                         batch_size=batch_size, capacity=batch_size*128, num_threads=1, min_after_dequeue=batch_size*32)
+
+     return {'images': images, 'label_boundaries': label_boundaries, 'label_rooms': label_rooms, 'label_doors': label_doors}
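The one-hot encoding done by `tf.one_hot` in the readers above can be sketched in plain NumPy (a sketch of the semantics, not the TF graph op): a `(H, W)` uint8 label map becomes a `(H, W, num_classes)` tensor with a single 1.0 per pixel.

```python
import numpy as np

# NumPy equivalent of tf.one_hot(room, 9, axis=-1): index into an identity
# matrix with the label map, yielding one 1.0 per pixel along the last axis.
def one_hot(label, num_classes):
    return np.eye(num_classes, dtype=np.float32)[label]

room = np.array([[0, 4],
                 [8, 1]], dtype=np.uint8)
room_one_hot = one_hot(room, 9)  # shape (2, 2, 9)
```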
utils/util.py ADDED
@@ -0,0 +1,65 @@
+ import cv2
+ import numpy as np
+ from scipy import ndimage
+
+ def fast_hist(im, gt, n=9):
+     """
+     Confusion matrix between prediction `im` and ground truth `gt`;
+     n is the number of classes.
+     """
+     k = (gt >= 0) & (gt < n)
+     return np.bincount(n * gt[k].astype(int) + im[k], minlength=n**2).reshape(n, n)
+
+ def flood_fill(test_array, h_max=255):
+     """
+     Fill in holes.
+     """
+     input_array = np.copy(test_array)
+     el = ndimage.generate_binary_structure(2, 2).astype(int)
+     inside_mask = ndimage.binary_erosion(~np.isnan(input_array), structure=el)
+     output_array = np.copy(input_array)
+     output_array[inside_mask] = h_max
+     output_old_array = np.copy(input_array)
+     output_old_array.fill(0)
+     el = ndimage.generate_binary_structure(2, 1).astype(int)
+     while not np.array_equal(output_old_array, output_array):
+         output_old_array = np.copy(output_array)
+         output_array = np.maximum(input_array, ndimage.grey_erosion(output_array, size=(3, 3), footprint=el))
+     return output_array
+
+ def fill_break_line(cw_mask):
+     # close small horizontal and vertical gaps in the wall mask
+     broken_line_h = np.array([[0, 0, 0, 0, 0],
+                               [0, 0, 0, 0, 0],
+                               [1, 0, 0, 0, 1],
+                               [0, 0, 0, 0, 0],
+                               [0, 0, 0, 0, 0]], dtype=np.uint8)
+     broken_line_h2 = np.array([[0, 0, 0, 0, 0],
+                                [0, 0, 0, 0, 0],
+                                [1, 1, 0, 1, 1],
+                                [0, 0, 0, 0, 0],
+                                [0, 0, 0, 0, 0]], dtype=np.uint8)
+     broken_line_v = np.transpose(broken_line_h)
+     broken_line_v2 = np.transpose(broken_line_h2)
+     cw_mask = cv2.morphologyEx(cw_mask, cv2.MORPH_CLOSE, broken_line_h)
+     cw_mask = cv2.morphologyEx(cw_mask, cv2.MORPH_CLOSE, broken_line_v)
+     cw_mask = cv2.morphologyEx(cw_mask, cv2.MORPH_CLOSE, broken_line_h2)
+     cw_mask = cv2.morphologyEx(cw_mask, cv2.MORPH_CLOSE, broken_line_v2)
+
+     return cw_mask
+
+ def refine_room_region(cw_mask, rm_ind):
+     # assign each wall-bounded connected region its majority room type
+     label_rm, num_label = ndimage.label((1 - cw_mask))
+     new_rm_ind = np.zeros(rm_ind.shape)
+     for j in range(1, num_label + 1):
+         mask = (label_rm == j).astype(np.uint8)
+         ys, xs = np.where(mask != 0)
+         area = (np.amax(xs) - np.amin(xs)) * (np.amax(ys) - np.amin(ys))
+         if area < 100:  # skip tiny regions
+             continue
+         room_types, type_counts = np.unique(mask * rm_ind, return_counts=True)
+         if len(room_types) > 1:
+             room_types = room_types[1:]   # ignore background type, which is zero
+             type_counts = type_counts[1:] # ignore background count
+         new_rm_ind += mask * room_types[np.argmax(type_counts)]
+
+     return new_rm_ind
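The `bincount` trick in `fast_hist` builds the confusion matrix in one vectorized pass: each (ground truth, prediction) pair maps to the flat index `n*gt + im`, so entry `[g, p]` counts pixels with ground truth `g` predicted as `p`. A tiny sanity-check sketch:

```python
import numpy as np

# Same logic as fast_hist above, exercised on four pixels of a 3-class problem.
def fast_hist(im, gt, n=9):
    k = (gt >= 0) & (gt < n)  # keep only valid ground-truth labels
    return np.bincount(n * gt[k].astype(int) + im[k], minlength=n**2).reshape(n, n)

gt = np.array([0, 0, 1, 2])  # ground-truth labels
im = np.array([0, 1, 1, 2])  # predicted labels (one class-0 pixel mispredicted as 1)
hist = fast_hist(im, gt, n=3)
```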